* Added support for defining properties for operator buttons, with
uiButGetOperatorPtrRNA. This was needed to clean up a hack that was there
for operator properties in RNA; a separate OperatorProperties type is now
used for storing operator properties, instead of them being part of the
Operator type itself (see the sketch after this list).
* Allow selecting menu items by releasing the mouse button over them, instead
of requiring another press.
* Fix some cases with hanging tooltips in the UI.
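A minimal sketch of the idea behind the separate type, with toy structs and
names (not the real wmOperator/RNA layout): the properties live in their own
struct that buttons can point to, instead of inside the operator itself.

    /* Hypothetical, simplified illustration only. */
    #include <stdio.h>

    typedef struct OperatorProperties {
        int frame;  /* stands in for an ID property group / RNA data */
    } OperatorProperties;

    typedef struct Operator {
        const char *idname;
        OperatorProperties *properties;  /* stored separately, not inside Operator */
    } Operator;

    int main(void)
    {
        OperatorProperties props = {10};
        Operator op = {"ED_OT_example", &props};  /* made-up operator name */

        printf("%s frame=%d\n", op.idname, op.properties->frame);
        return 0;
    }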
More notifier cleanup:
- removed the view2d sync notifier; its data operations are too complex
for UI hints/notes, and direct calls work too :)
- added the missing GPL header in the region file
Noticed a weird delay in menu refreshing now... will check.
operators. RNA property buttons will automatically fill in the label, min/max,
etc. if they are not specified. Operator menu buttons will look up the key
combination in the handlers and add it automatically.
uiDefButR, uiDefIconButR, uiDefIconTextButR
uiDefButO, uiDefIconButO, uiDefIconTextButO
uiDefButO takes a context pointer to do the key lookup; I don't really like this..
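To make the auto-fill idea concrete, here is a small self-contained sketch; the
types and the def_but_rna name are made up for illustration and are not the
real uiDefButR signature:

    /* Hypothetical sketch: an RNA-aware button definition that fills in the
     * label and min/max from the property definition when not specified. */
    #include <stdio.h>
    #include <string.h>

    typedef struct PropertyDef {
        const char *identifier;
        const char *ui_name;
        float min, max;
    } PropertyDef;

    typedef struct Button {
        char label[64];
        float min, max;
    } Button;

    /* If no label/range is given, fall back to the property definition. */
    static Button def_but_rna(const PropertyDef *prop, const char *str, float min, float max)
    {
        Button but;

        strncpy(but.label, str ? str : prop->ui_name, sizeof(but.label) - 1);
        but.label[sizeof(but.label) - 1] = '\0';
        but.min = (min != max) ? min : prop->min;
        but.max = (min != max) ? max : prop->max;

        return but;
    }

    int main(void)
    {
        PropertyDef frame = {"frame_current", "Current Frame", 1.0f, 300000.0f};
        Button but = def_but_rna(&frame, NULL, 0.0f, 0.0f);  /* label and range auto-filled */

        printf("%s [%g..%g]\n", but.label, but.min, but.max);
        return 0;
    }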
* Added a context pointer to the function callbacks for buttons and blocks.
* Added uiBlockSetHandleFunc, which will be called with the button
return value. This seems somewhat redundant with uiBlockSetButmFunc,
but the latter uses a2 to pass the value (see the sketch below).
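Roughly, the callback shape could look like this (hypothetical typedef and
names, not the actual uiBlockSetHandleFunc signature):

    #include <stdio.h>

    typedef struct Context Context;  /* stands in for bContext */

    /* the handle callback gets the context pointer and the button return value */
    typedef void (*BlockHandleFunc)(Context *C, void *arg, int but_retval);

    static void demo_handle(Context *C, void *arg, int but_retval)
    {
        (void)C; (void)arg;
        printf("button returned %d\n", but_retval);
    }

    int main(void)
    {
        BlockHandleFunc handle_func = demo_handle;  /* what the block would store */
        handle_func(NULL, NULL, 42);                /* called when a button exits */
        return 0;
    }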
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up.
This should fix the XXXs in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c as part of converting the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers adds the region-level UI handlers, to be called
where keymap handlers etc. are added. This replaces the UI keymap (see the
sketch after these notes).
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which
will block events.
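To sketch what the region-level registration amounts to (toy, self-contained
types; not the real UI_add_region_handlers API or data structures):

    #include <stdio.h>

    typedef struct Handler {
        const char *name;
        struct Handler *next;
    } Handler;

    typedef struct Region {
        Handler *handlers;  /* region-level handler list */
    } Region;

    static void add_handler(Region *region, Handler *handler)
    {
        handler->next = region->handlers;
        region->handlers = handler;
    }

    /* stands in for UI_add_region_handlers(): register the UI handler in the
     * same place where the keymap handlers for the region are added */
    static void ui_add_region_handlers(Region *region)
    {
        static Handler ui_handler = {"ui", NULL};
        add_handler(region, &ui_handler);
    }

    int main(void)
    {
        Region region = {NULL};
        static Handler keymap = {"keymap", NULL};
        Handler *h;

        add_handler(&region, &keymap);
        ui_add_region_handlers(&region);  /* replaces the old UI keymap */

        for (h = region.handlers; h; h = h->next)
            printf("handler: %s\n", h->name);
        return 0;
    }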
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling callback and a remove callback, and like operator
modal handlers it will remember the area and region if it is registered at the
window level (sketched below).
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events when the mouse moves outside of the
region. This was already happening for areas, but UI buttons are at the region
level, so we need it there too.
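A rough sketch of the shape of such a handler (hypothetical names and fields,
not the actual windowmanager handler definition):

    typedef struct Context Context;  /* stands in for bContext */
    typedef struct Event Event;      /* stands in for wmEvent */
    typedef struct Area Area;
    typedef struct Region Region;

    typedef int  (*UIHandleFunc)(Context *C, const Event *event, void *userdata);
    typedef void (*UIRemoveFunc)(Context *C, void *userdata);

    typedef struct UIHandler {
        UIHandleFunc handle;  /* event handling callback */
        UIRemoveFunc remove;  /* called when the handlers are removed */
        void *userdata;
        Area *area;           /* remembered when registered at the window level, */
        Region *region;       /* so they can be restored in the context later */
    } UIHandler;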
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons sometimes seem to move by a pixel or so when
interacting with them.
* This seems not to be due to this commit, but UI and keymap handlers are
leaking. Handlers appear to be added to regions in all screens, including
regions of areas that are not visible, but these handlers are never removed.
Probably there should only be handlers in visible regions?
* Fix sensor data pointer code; also made the sensor type non-editable,
since changing it would need more work than just setting the type.
* Fix a fairly obscure bug related to inheritance and sorting.
* Complete DNA_ID.h wrapping, just a few extra properties and the
Library struct, most of this is internal.
* Disable editable pointers for now, difficult to support well.
* Swap parameters in RNA_access.h functions to make them more
consistent.
* Rename the rna members for operators to wmOperatorType.srna and
wmOperator.ptr, to make the distinction a bit clearer.
* Removed the RNA_int_default and similar functions, they're too
confusing. RNA_property_is_set can still be used to achieve
the same goal.
* Add functions to create RNA pointers.
Some example code for RNA data access and operator properties:
http://wiki.blender.org/index.php/BlenderDev/Blender2.5/RNAExampleCode
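For a quick local flavour of the data-access pattern (entirely toy types and
names; see the wiki page above for the real RNA API), pointers wrap a struct
definition plus a data pointer, and values are read through accessor functions:

    #include <stdio.h>

    typedef struct StructDef {
        const char *identifier;
    } StructDef;

    typedef struct Pointer {
        const StructDef *type;
        void *data;
    } Pointer;

    typedef struct ToyObject {
        float location[3];
    } ToyObject;

    static const StructDef ToyObject_def = {"ToyObject"};

    /* "create a pointer": wrap a struct definition and a data pointer */
    static Pointer pointer_create(const StructDef *type, void *data)
    {
        Pointer ptr = {type, data};
        return ptr;
    }

    /* accessor function, instead of poking into the struct directly */
    static float float_get_index(const Pointer *ptr, int index)
    {
        const ToyObject *ob = ptr->data;
        return ob->location[index];
    }

    int main(void)
    {
        ToyObject ob = {{1.0f, 2.0f, 3.0f}};
        Pointer ptr = pointer_create(&ToyObject_def, &ob);

        printf("%s.location[1] = %g\n", ptr.type->identifier, float_get_index(&ptr, 1));
        return 0;
    }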
* Wrap most user-editable data in DNA_mesh_types.h and DNA_meshdata_types.h.
This still needs to be improved in some areas though, especially how to deal
with data layers (expose per element, as arrays, or both?), and with data in
face corners (bmesh-type data structures are more logical here).
Tweaks to RNA defining to make some cases easier:
* Added range callback function for int/float.
* Added 'skip' callback for listbase and array iterators to skip items in
the collection.
* Extra error print when calling the wrong define_property_*_sdna function.
* Also made the button code respect the non_editable flag; this is a quick
change though, support for properly graying out etc. still needs to be added.
* Added support for ID properties, mapped as follows:
* IDP_Int = RNA Int
* IDP_Float, IDP_Double = RNA Float
* IDP_String = RNA String
* IDP_Group = RNA IDPropertyGroup Struct
* IDP_Array = RNA Array
* PropertyRNA and StructRNA are now defined private for the module,
to force external code to always use accessor functions.
* Rename cname to identifier.
* Rename PropertyEnumItem to EnumPropertyItem.
* Wrapped min/max/step/precision, pointer type for RNA.
* Draw FLT_MAX a bit better in buttons.
* Added RNA list viewer. This is currently drawn in the outliner
window, the UI is limited but it is just intended to test RNA
at the moment.
* Added UI names for currently wrapped properties.
* Made iterating collections a bit more convenient.
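As an illustration of the convenience direction (a toy iterator, not the real
RNA collection API), the idea is a begin/valid/next pattern that hides whether
the underlying storage is a ListBase or an array:

    #include <stdio.h>

    typedef struct Iterator {
        int index, length;
        const float *array;
    } Iterator;

    static void iter_begin(Iterator *iter, const float *array, int length)
    {
        iter->index = 0;
        iter->length = length;
        iter->array = array;
    }

    static int   iter_valid(const Iterator *iter) { return iter->index < iter->length; }
    static void  iter_next(Iterator *iter)        { iter->index++; }
    static float iter_get(const Iterator *iter)   { return iter->array[iter->index]; }

    int main(void)
    {
        const float values[] = {0.5f, 1.5f, 2.5f};
        Iterator iter;

        for (iter_begin(&iter, values, 3); iter_valid(&iter); iter_next(&iter))
            printf("%g\n", iter_get(&iter));
        return 0;
    }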
This is based on the current trunk version, so these files should not need
merges. There are two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48; these can be enabled again once trunk is
merged into this branch.
Further, this is not all of the interface code; many parts are commented out:
* interface.c: nearly all button types; still missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions.c: this is a part of interface.c, split off; it contains code
to create regions for tooltips, menus, pupmenu (that one is currently crashing),
and the color chooser, basically regions with buttons, which is fairly
independent of the core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. This works,
but it still uses bad globals and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That does mean we need some sort
of persistence, even though the Blender model is to recreate buttons for each
redraw. So when a new block is created, some matching happens to find out
which buttons correspond to buttons in the previously created block, and for
activated buttons some data is then copied over to the new button (see the
sketch after this list).
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure
how it would integrate here?
* Currently only full window redraws are used; this should become per region
and maybe per button later.
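The matching mentioned above could look roughly like this: a simplified,
hypothetical sketch with toy types, where a rebuilt button inherits the
editing state of its counterpart in the old block.

    #include <stdio.h>
    #include <string.h>

    typedef struct Button {
        char name[32];
        int retval;
        void *active_state;  /* non-NULL while the button is being edited */
        struct Button *next;
    } Button;

    typedef struct Block {
        Button *buttons;
    } Block;

    static Button *find_matching_but(Block *oldblock, const Button *newbut)
    {
        Button *but;

        for (but = oldblock->buttons; but; but = but->next)
            if (but->retval == newbut->retval && strcmp(but->name, newbut->name) == 0)
                return but;
        return NULL;
    }

    static void block_match_old(Block *newblock, Block *oldblock)
    {
        Button *newbut;

        for (newbut = newblock->buttons; newbut; newbut = newbut->next) {
            Button *oldbut = find_matching_but(oldblock, newbut);

            if (oldbut && oldbut->active_state)
                newbut->active_state = oldbut->active_state;  /* keep the edit alive */
        }
    }

    int main(void)
    {
        int fake_state = 1;
        Button old_but = {"Frame", 1, &fake_state, NULL};
        Button new_but = {"Frame", 1, NULL, NULL};
        Block oldblock = {&old_but}, newblock = {&new_but};

        block_match_old(&newblock, &oldblock);
        printf("state carried over: %s\n", new_but.active_state ? "yes" : "no");
        return 0;
    }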
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead; this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until it is no longer active. This
means clicking, text editing, number dragging, opening menu blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be
made non-blocking. That means variables that were previously on the stack now
need to be stored in a struct, so that they can be accessed again when the
operator receives more events.
* Additionally, the position in the ui_do_but code used to indicate the state;
now that state needs to be set explicitly in order to handle the right events
in the right state. So an activated button can be in one of these states: init,
highlight, wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit (sketched after this list).
* For each button type, a ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed, for example, and there is an option in the code to enable
this. Since the code is non-blocking and can even deal with the button being
deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is not
being edited, since that would mean data is changed without correct updates
happening, while some other part of Blender may be accessing that data in the
meantime. So data values, strings and vectors are written to a temporary
location and only flushed in the apply function.
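The states and the stack-to-struct change can be sketched like this
(hypothetical, heavily simplified types; only the names of the states come
from the list above):

    #include <stdio.h>

    typedef enum ButtonActivateState {
        BUTTON_STATE_INIT,
        BUTTON_STATE_HIGHLIGHT,
        BUTTON_STATE_WAIT_FLASH,
        BUTTON_STATE_WAIT_RELEASE,
        BUTTON_STATE_WAIT_KEY_EVENT,
        BUTTON_STATE_NUM_EDITING,
        BUTTON_STATE_TEXT_EDITING,
        BUTTON_STATE_TEXT_SELECTING,
        BUTTON_STATE_BLOCK_OPEN,
        BUTTON_STATE_EXIT
    } ButtonActivateState;

    typedef struct ButtonActivate {
        ButtonActivateState state;  /* explicit state instead of stack position */
        float editval;              /* temporary value while dragging/typing */
        float *poin;                /* the real data, written only on apply */
    } ButtonActivate;

    /* apply: flush the temporary value to the actual data */
    static void button_apply(ButtonActivate *data)
    {
        *data->poin = data->editval;
    }

    int main(void)
    {
        float value = 1.0f;
        ButtonActivate data = {BUTTON_STATE_NUM_EDITING, 1.0f, &value};

        data.editval = 2.5f;            /* number dragging edits the temp copy */
        data.state = BUTTON_STATE_EXIT;
        button_apply(&data);            /* only now the real data changes */

        printf("value = %g\n", value);
        return 0;
    }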
Regions:
* Menus, the color chooser, tooltips etc. all create screen-level regions. Such
a menu block gives a handle to the button that creates it, and this handle will
contain the results of the menu block once a MESSAGE event is received from
that menu block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way, and ui_positionblock still works with window
coordinates, but after that the block and its buttons are brought back to
region coordinates, since they are now contained in a region (see the sketch
below).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
should have enough information with these screen-level regions to have full
control over what gets drawn when, and to then do correct compositing.
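The coordinate conversion mentioned above is essentially an offset by the
region's window rectangle; a hypothetical sketch (local stand-in type, not the
actual UI code):

    #include <stdio.h>

    /* local stand-in for the region's window rectangle */
    typedef struct rcti { int xmin, xmax, ymin, ymax; } rcti;

    static void window_to_region(const rcti *winrct, int *x, int *y)
    {
        *x -= winrct->xmin;
        *y -= winrct->ymin;
    }

    int main(void)
    {
        rcti winrct = {100, 400, 200, 500};  /* where the region sits in the window */
        int x = 150, y = 260;                /* block position in window space */

        window_to_region(&winrct, &x, &y);
        printf("region coords: %d, %d\n", x, y);
        return 0;
    }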
Testing:
* The header in the time space currently has some buttons to test the UI code.