Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There are two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48; these can be enabled again once trunk is
merged into this branch.
Further, this is not all of the interface code; many parts are still commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions.c: a part of interface.c that was split off; it contains code
to create regions for tooltips, menus, pupmenu (that one currently crashes), the
color chooser, and so on: basically regions with buttons, which is fairly
independent of the core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c (formerly language.c): for drawing text and translation. This works,
but it still uses bad globals and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc. is still nearly the same. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means, though, that we need
some sort of persistence, even though the Blender model is to recreate buttons
on each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button (a
rough sketch of this follows after this list).
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure how
it would integrate here.
* Currently only full window redraws are used; this should become per region
and maybe per button later.
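To make the button matching above concrete, here is a rough, self-contained C
sketch; it is not the actual implementation, and matching by type and label (as
well as every name used here) is an assumption made purely for illustration:

	#include <stddef.h>
	#include <string.h>

	/* toy stand-ins for uiBut/uiBlock, invented for this sketch */
	typedef struct DemoBut {
		int type;               /* button type */
		const char *str;        /* button label */
		void *activedata;       /* interaction state of an activated button, if any */
		struct DemoBut *next;
	} DemoBut;

	/* find the button in the previous block that corresponds to a new button */
	DemoBut *find_old_button(DemoBut *oldlist, const DemoBut *newbut)
	{
		DemoBut *oldbut;

		for(oldbut= oldlist; oldbut; oldbut= oldbut->next)
			if(oldbut->type == newbut->type && strcmp(oldbut->str, newbut->str) == 0)
				return oldbut;

		return NULL;
	}

	/* carry the active state over so editing survives the rebuild */
	void copy_activation(DemoBut *oldlist, DemoBut *newbut)
	{
		DemoBut *oldbut= find_old_button(oldlist, newbut);

		if(oldbut && oldbut->activedata) {
			newbut->activedata= oldbut->activedata;
			oldbut->activedata= NULL;
		}
	}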
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead; this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be
made non-blocking. That means variables that were previously on the stack now
need to be stored in a struct so that they can be accessed again when the
operator receives more events.
* Additionally, the position in the ui_do_but code used to indicate the state;
now that state needs to be set explicitly in order to handle the right events
in the right state. So an activated button can be in one of these states: init,
highlight, wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type, a ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed, for example, and there is an option in the code to enable
this. Since the code is non-blocking and can even deal with the button being
deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is not
edited directly, since that would mean data is changed without correct updates
happening while some other part of Blender may be accessing that data in the
meantime. So data values, strings and vectors are written to a temporary
location and only flushed in the apply function (a combined sketch of these
last three points follows after this list).
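To illustrate the last three points together, here is a minimal, self-contained
C sketch; all names are invented for illustration and this is not the real
Blender code:

	#include <stdio.h>

	/* explicit states instead of the position in ui_do_but implying them */
	typedef enum {
		BUT_STATE_INIT, BUT_STATE_HIGHLIGHT, BUT_STATE_WAIT_FLASH,
		BUT_STATE_WAIT_RELEASE, BUT_STATE_WAIT_KEY_EVENT, BUT_STATE_NUM_EDITING,
		BUT_STATE_TEXT_EDITING, BUT_STATE_TEXT_SELECTING, BUT_STATE_BLOCK_OPEN,
		BUT_STATE_EXIT
	} DemoButState;

	/* what used to be stack variables in ui_do_but, now kept between events */
	typedef struct DemoHandleData {
		DemoButState state;
		float tempvalue;        /* edited copy; the real data stays untouched */
		float *poin;            /* the real data, written only by the apply step */
	} DemoHandleData;

	/* the separated-out apply step: flush the temporary value to the real data */
	void demo_apply_but(DemoHandleData *data)
	{
		*data->poin = data->tempvalue;
		data->state = BUT_STATE_EXIT;
	}

	int main(void)
	{
		float value = 1.0f;
		DemoHandleData data = {BUT_STATE_NUM_EDITING, 2.5f, &value};

		demo_apply_but(&data);        /* only now does the real value change */
		printf("%.2f\n", value);      /* prints 2.50 */
		return 0;
	}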
Regions:
* Menus, the color chooser, tooltips etc. all create screen-level regions. Such
a menu block gives a handle to the button that creates it, which will contain
the results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way, and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates, since they are now contained in a region (see the sketch after
this list).
* The flush/overdraw frontbuffer drawing code was removed; the window manager
should have enough information with these screen-level regions to have full
control over what gets drawn when, and to then do correct compositing.
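As a sketch of the coordinate conversion mentioned above (invented names, and
the assumption that subtracting the region's lower-left window corner is all
that is needed):

	/* bring a point from window space into region space */
	typedef struct DemoRect { int xmin, ymin, xmax, ymax; } DemoRect;

	void demo_window_to_region(const DemoRect *region_winrct, float *x, float *y)
	{
		/* ui_positionblock computed window coordinates; shift them so that
		   (0,0) becomes the region's lower-left corner */
		*x -= (float)region_winrct->xmin;
		*y -= (float)region_winrct->ymin;
	}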
Testing:
* The header in the time space currently has some buttons to test the UI code.
/**
 * $Id$
 *
 * ***** BEGIN GPL LICENSE BLOCK *****
 *
 * This program is free software; you can redistribute it and/or
 * modify it under the terms of the GNU General Public License
 * as published by the Free Software Foundation; either version 2
 * of the License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program; if not, write to the Free Software Foundation,
 * Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
 *
 * The Original Code is Copyright (C) 2001-2002 by NaN Holding BV.
 * All rights reserved.
 *
 * The Original Code is: all of this file.
 *
 * Contributor(s): none yet.
 *
 * ***** END GPL LICENSE BLOCK *****
 */

#ifndef UI_INTERFACE_H
#define UI_INTERFACE_H

struct ID;
struct ListBase;
struct ARegion;
struct wmWindow;
struct wmWindowManager;
struct wmOperator;
struct AutoComplete;
struct bContext;
struct PointerRNA;
struct PropertyRNA;
struct ReportList;

/* uiBlock->dt */
#define UI_EMBOSS 0 /* use one of the themes for drawing */
#define UI_EMBOSSN 1 /* Nothing */
#define UI_EMBOSSM 2 /* Minimal builtin emboss, also for logic buttons */
#define UI_EMBOSSP 3 /* Pulldown */
#define UI_EMBOSSR 4 /* Rounded */
#define UI_EMBOSST 5 /* Table */

#define UI_EMBOSSX 0 /* for a python file, which i can't change.... duh! */

/* uiBlock->direction */
#define UI_TOP 1
#define UI_DOWN 2
#define UI_LEFT 4
#define UI_RIGHT 8
#define UI_DIRECTION 15
#define UI_CENTER 16
#define UI_SHIFT_FLIPPED 32

/* uiBlock->autofill */
#define UI_BLOCK_COLLUMNS 1
#define UI_BLOCK_ROWS 2

/* uiBlock->flag (controls) */
#define UI_BLOCK_LOOP 1
#define UI_BLOCK_REDRAW 2
#define UI_BLOCK_RET_1 4 /* XXX 2.5 not implemented */
#define UI_BLOCK_NUMSELECT 8
#define UI_BLOCK_ENTER_OK 16
#define UI_BLOCK_NOSHADOW 32
#define UI_BLOCK_NO_HILITE 64 /* XXX 2.5 not implemented */
#define UI_BLOCK_MOVEMOUSE_QUIT 128
#define UI_BLOCK_KEEP_OPEN 256

/* uiMenuBlockHandle->menuretval */
#define UI_RETURN_CANCEL 1 /* cancel all menus cascading */
#define UI_RETURN_OK 2 /* choice made */
#define UI_RETURN_OUT 4 /* left the menu */
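
/* Illustrative only, not part of the original header: a caller that owns a
 * uiMenuBlockHandle would typically test its menuretval against the flags
 * above, for example:
 *
 *   if(handle->menuretval & UI_RETURN_OK) { ... use the result ... }
 *   else if(handle->menuretval & UI_RETURN_CANCEL) { ... discard it ... }
 */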

/* block->flag bits 12-15 are identical to but->flag bits */

/* block->font, for now: bold = medium+1 */
#define UI_HELV 0
#define UI_HELVB 1

/* panel controls */
#define UI_PNL_TRANSP 1
#define UI_PNL_SOLID 2

#define UI_PNL_CLOSE 32
#define UI_PNL_STOW 64
#define UI_PNL_TO_MOUSE 128
#define UI_PNL_UNSTOW 256
#define UI_PNL_SCALE 512

/* warning the first 4 flags are internal */
/* but->flag */
#define UI_TEXT_LEFT 16
#define UI_ICON_LEFT 32
#define UI_ICON_RIGHT 64
/* control for button type block */
#define UI_MAKE_TOP 128
#define UI_MAKE_DOWN 256
#define UI_MAKE_LEFT 512
#define UI_MAKE_RIGHT 1024
/* dont draw hilite on mouse over */
#define UI_NO_HILITE 2048
/* button align flag, for drawing groups together */
#define UI_BUT_ALIGN (15<<12)
#define UI_BUT_ALIGN_TOP (1<<12)
#define UI_BUT_ALIGN_LEFT (1<<13)
#define UI_BUT_ALIGN_RIGHT (1<<14)
#define UI_BUT_ALIGN_DOWN (1<<15)
#define UI_BUT_DISABLED (1<<16)
/* Button types, bits stored in 1 value... and a short even!
- bits 0-4: bitnr (0-31)
- bits 5-7: pointer type
- bit 8: for 'bit'
- bit 9-15: button type (now 6 bits, 64 types)
*/

#define CHA 32
#define SHO 64
#define INT 96
#define FLO 128
#define FUN 192
#define BIT 256

#define BUTPOIN (128+64+32)

#define BUT (1<<9)
#define ROW (2<<9)
#define TOG (3<<9)
#define SLI (4<<9)
#define NUM (5<<9)
#define TEX (6<<9)
#define TOG3 (7<<9)
#define TOGR (8<<9)
#define TOGN (9<<9)
#define LABEL (10<<9)
#define MENU (11<<9)
#define ICONROW (12<<9)
#define ICONTOG (13<<9)
#define NUMSLI (14<<9)
#define COL (15<<9)
#define IDPOIN (16<<9)
#define HSVSLI (17<<9)
#define SCROLL (18<<9)
#define BLOCK (19<<9)
#define BUTM (20<<9)
#define SEPR (21<<9)
#define LINK (22<<9)
#define INLINK (23<<9)
#define KEYEVT (24<<9)
#define ICONTEXTROW (25<<9)
#define HSVCUBE (26<<9)
#define PULLDOWN (27<<9)
#define ROUNDBOX (28<<9)
#define CHARTAB (29<<9)
#define BUT_COLORBAND (30<<9)
#define BUT_NORMAL (31<<9)
#define BUT_CURVE (32<<9)
#define BUT_TOGDUAL (33<<9)
#define ICONTOGN (34<<9)
#define FTPREVIEW (35<<9)
#define NUMABS (36<<9)
#define BUTTYPE (63<<9)
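
/* Illustrative only, not part of the original header: a packed button type
 * value (here a hypothetical 'buttype' variable) can be taken apart with the
 * masks above, following the bit layout described before these defines:
 *
 *   int type  = buttype & BUTTYPE;   // button type, e.g. NUM or TEX
 *   int poin  = buttype & BUTPOIN;   // pointer type: CHA/SHO/INT/FLO/FUN
 *   int bitnr = buttype & 31;        // bit number 0-31, used when BIT is set
 */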
typedef struct uiBut uiBut;
typedef struct uiBlock uiBlock;

/* Common Drawing Functions */

void uiEmboss(float x1, float y1, float x2, float y2, int sel);
void uiRoundBox(float minx, float miny, float maxx, float maxy, float rad);
void uiSetRoundBox(int type);
void uiRoundRect(float minx, float miny, float maxx, float maxy, float rad);
void uiDrawMenuBox(float minx, float miny, float maxx, float maxy, short flag, short direction);
void uiDrawBoxShadow(unsigned char alpha, float minx, float miny, float maxx, float maxy);
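
/* Illustrative only, not part of the original header: a minimal call sequence
 * for the drawing helpers above, with made-up coordinates; the exact meaning
 * of the uiSetRoundBox type argument is not documented here:
 *
 *   uiSetRoundBox(type);                                // choose the rounding style first
 *   uiRoundBox(10.0f, 10.0f, 110.0f, 30.0f, 5.0f);      // rounded box with radius 5
 *   uiDrawBoxShadow(128, 10.0f, 10.0f, 110.0f, 30.0f);  // box shadow for the same rectangle, alpha 128
 */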

/* Popup Menu's */

typedef struct uiMenuBlockHandle {
	/* internal */
	struct ARegion *region;

	int towardsx, towardsy;
	double towardstime;
	int dotowards;

	int popup;
	void (*popup_func)(struct bContext *C, void *arg, int event);
	void *popup_arg;

	/* for operator menus */
	struct wmOperator *op_arg;
	const char *propname;

	/* return values */
	int butretval;
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up.
This should fix the XXXs in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed (a short usage sketch follows these notes).
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
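To make the new calling convention concrete, here is a minimal sketch of what an editor's region callbacks could look like. This is not code from this commit: the includes are omitted, and the UI_EMBOSS/UI_HELV defines, the ar->handlers and ar->uiblocks fields and the exact UI_add_region_handlers signature (assumed to take the region's handler list) are assumptions.

/* sketch: region init registers the region level UI handlers */
static void my_region_init(wmWindowManager *wm, ARegion *ar)
{
	UI_add_region_handlers(&ar->handlers);
}

/* sketch: blocks are built and drawn with the new context argument */
static void my_region_draw(const bContext *C, ARegion *ar)
{
	uiBlock *block = uiBeginBlock(C, ar, "my_region_block", UI_EMBOSS, UI_HELV);

	/* ... uiDefBut* calls ... */

	uiEndBlock(C, block);
	uiDrawBlock(C, block);
}

/* sketch: freeing the blocks also takes the context, so timers and
   tooltips can be cancelled when the region goes away */
static void my_region_free(const bContext *C, ARegion *ar)
{
	uiFreeBlocks(C, &ar->uiblocks);
}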
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling callback and a remove callback, and like operator
modal handlers it will remember the area and region if it is registered at the
window level (the sketch after this list shows the shape of these callbacks).
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events when the mouse goes outside of the
region. This was already happening for areas, but UI buttons are at the region
level, so we need it there.
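As a sketch of the handler shape described above; the typedef names, the return convention and the userdata argument are illustrative assumptions, not the actual windowmanager API:

/* a UI handler pairs an event callback with a remove callback */
typedef int (*uiHandlerEventFunc)(bContext *C, wmEvent *event, void *userdata);
typedef void (*uiHandlerRemoveFunc)(bContext *C, void *userdata);

static int region_ui_handler(bContext *C, wmEvent *event, void *userdata)
{
	/* activate buttons, edit text, drag numbers, ...; a nonzero return
	   would mean the event is consumed and blocked for other handlers */
	return 0;
}

static void region_ui_handler_remove(bContext *C, void *userdata)
{
	/* called when all handlers of the region are removed: cancel timers,
	   close tooltips and menus, free button activation data */
}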
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons sometimes seem to move by a pixel or so when
interacting with them.
* Not caused by this commit, it seems, but UI and keymap handlers are leaking:
handlers are being added to regions in all screens, including regions of areas
that are not visible, but these handlers are never removed. Probably there
should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
	int butretval;
	int menuretval;
	float retvalue;
	float retvec[3];
} uiMenuBlockHandle;
typedef uiBlock* (*uiBlockFuncFP)(struct bContext *C, struct uiMenuBlockHandle *handle, void *arg1);
typedef void (*uiPupmenuFunc)(struct bContext *C, void *arg, int event);
void uiPupmenuSetActive(int val);
void uiPupmenuOperator(struct bContext *C, int maxrow, struct wmOperator *op, const char *propname, char *str);
void uiPupmenu(struct bContext *C, int maxrow, uiPupmenuFunc func, void *arg, char *str, ...);
void uiPupmenuOkee(struct bContext *C, char *opname, char *str, ...);
void uiPupmenuSaveOver(struct bContext *C, char *opname, char *filename, ...);
void uiPupmenuNotice(struct bContext *C, char *str, ...);
void uiPupmenuError(struct bContext *C, char *str, ...);
void uiPupmenuReports(struct bContext *C, struct ReportList *reports);
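A hedged usage sketch of the popup menu calls above: the "%t"/"%x" format string convention is carried over from the 2.4x pupmenu code, and the callback wiring shown is illustrative rather than taken from this header.

/* the event codes from the "%x" items arrive in the uiPupmenuFunc callback */
static void erase_all_cb(struct bContext *C, void *arg, int event)
{
	if(event == 1) {
		/* ... erase everything ... */
	}
}

static void ask_erase_all(struct bContext *C)
{
	uiPupmenu(C, 0, erase_all_cb, NULL, "Erase All?%t|Yes%x1|No%x2");
}

/* the notice/error variants are simple printf-style wrappers */
static void report_bad_selection(struct bContext *C, int count)
{
	uiPupmenuError(C, "Cannot proceed, %d objects selected", count);
}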
/* Block */
uiBlock *uiBeginBlock(const struct bContext *C, struct ARegion *region, char *name, short dt, short font);
void uiEndBlock(const struct bContext *C, uiBlock *block);
void uiDrawBlock(const struct bContext *C, struct uiBlock *block);
uiBlock *uiGetBlock(char *name, struct ARegion *ar);
void uiFreeBlock(const struct bContext *C, uiBlock *block);
void uiFreeBlocks(const struct bContext *C, struct ListBase *lb);
void uiFreeInactiveBlocks(const struct bContext *C, struct ListBase *lb);
void uiBoundsBlock(struct uiBlock *block, int addval);
void uiTextBoundsBlock(uiBlock *block, int addval);
void uiBlockSetButLock(uiBlock *block, int val, char *lockstr);
void uiBlockClearButLock(uiBlock *block);
/* automatic aligning, horizontal or vertical */
void uiBlockBeginAlign(uiBlock *block);
void uiBlockEndAlign(uiBlock *block);
/* Misc */
void uiSetCurFont(uiBlock *block, int index);
void *uiSetCurFont_ext(float aspect);
void uiDefFont(unsigned int index, void *xl, void *large, void *medium, void *small);
void uiComposeLinks(uiBlock *block);
uiBut *uiFindInlink(uiBlock *block, void *poin);
void uiBlockPickerButtons(struct uiBlock *block, float *col, float *hsv, float *old, char *hexcol, char mode, short retval);
/* Defining Buttons */
uiBut *uiDefBut(uiBlock *block,
		int type, int retval, char *str,
		short x1, short y1,
		short x2, short y2,
		void *poin,
		float min, float max,
		float a1, float a2, char *tip);
uiBut *uiDefButF(uiBlock *block, int type, int retval, char *str, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButBitF(uiBlock *block, int type, int bit, int retval, char *str, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButI(uiBlock *block, int type, int retval, char *str, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButBitI(uiBlock *block, int type, int bit, int retval, char *str, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButS(uiBlock *block, int type, int retval, char *str, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButBitS(uiBlock *block, int type, int bit, int retval, char *str, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButC(uiBlock *block, int type, int retval, char *str, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButBitC(uiBlock *block, int type, int bit, int retval, char *str, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButR(uiBlock *block, int type, int retval, char *str, short x1, short y1, short x2, short y2, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefButO(struct bContext *C, uiBlock *block, int type, char *opname, int opcontext, char *str, short x1, short y1, short x2, short y2, char *tip);
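To show how the uiDefBut* family above fits together, here is a minimal sketch of a header region defining a toggle and a number button in a block. Only the uiDefButS/uiDefButF signatures are taken from the declarations above; the uiBeginBlock/uiEndBlock calls, the UI_EMBOSS/UI_HELV arguments and the TOG/NUM button type constants are assumptions based on the 2.48 API and the notes further down in this file.

#include "UI_interface.h"

static short test_toggle= 0;
static float test_value= 1.0f;

static void myspace_header_buttons(const struct bContext *C, struct ARegion *ar)
{
	/* assumed: block creation now takes the context and the region */
	uiBlock *block= uiBeginBlock(C, ar, "myspace_header", UI_EMBOSS, UI_HELV);

	/* retval 1 is an arbitrary event value passed on to the block handle func */
	uiDefButS(block, TOG, 1, "Enable", 10, 5, 80, 20, &test_toggle, 0.0, 0.0, 0, 0, "Sketch: toggle backed by a short");
	uiDefButF(block, NUM, 1, "Value:", 95, 5, 120, 20, &test_value, 0.0, 10.0, 10, 2, "Sketch: number button editing a float");

	uiEndBlock(C, block);
}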
uiBut *uiDefIconBut(uiBlock *block,
		int type, int retval, int icon,
		short x1, short y1,
		short x2, short y2,
		void *poin,
		float min, float max,
		float a1, float a2, char *tip);
uiBut *uiDefIconButF(uiBlock *block, int type, int retval, int icon, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButBitF(uiBlock *block, int type, int bit, int retval, int icon, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButI(uiBlock *block, int type, int retval, int icon, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButBitI(uiBlock *block, int type, int bit, int retval, int icon, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButS(uiBlock *block, int type, int retval, int icon, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButBitS(uiBlock *block, int type, int bit, int retval, int icon, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButC(uiBlock *block, int type, int retval, int icon, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButBitC(uiBlock *block, int type, int bit, int retval, int icon, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButR(uiBlock *block, int type, int retval, int icon, short x1, short y1, short x2, short y2, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconButO(struct bContext *C, uiBlock *block, int type, char *opname, int opcontext, int icon, short x1, short y1, short x2, short y2, char *tip);
uiBut *uiDefIconTextBut(uiBlock *block,
		int type, int retval, int icon, char *str,
		short x1, short y1,
		short x2, short y2,
		void *poin,
		float min, float max,
		float a1, float a2, char *tip);
uiBut *uiDefIconTextButF(uiBlock *block, int type, int retval, int icon, char *str, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButBitF(uiBlock *block, int type, int bit, int retval, int icon, char *str, short x1, short y1, short x2, short y2, float *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButI(uiBlock *block, int type, int retval, int icon, char *str, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButBitI(uiBlock *block, int type, int bit, int retval, int icon, char *str, short x1, short y1, short x2, short y2, int *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButS(uiBlock *block, int type, int retval, int icon, char *str, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButBitS(uiBlock *block, int type, int bit, int retval, int icon, char *str, short x1, short y1, short x2, short y2, short *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButC(uiBlock *block, int type, int retval, int icon, char *str, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButBitC(uiBlock *block, int type, int bit, int retval, int icon, char *str, short x1, short y1, short x2, short y2, char *poin, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButR(uiBlock *block, int type, int retval, int icon, char *str, short x1, short y1, short x2, short y2, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, char *tip);
uiBut *uiDefIconTextButO(struct bContext *C, uiBlock *block, int type, char *opname, int opcontext, int icon, char *str, short x1, short y1, short x2, short y2, char *tip);
typedef void (*uiIDPoinFuncFP) (struct bContext *C, char *str, struct ID **idpp);
uiBut *uiDefIDPoinBut(struct uiBlock *block, uiIDPoinFuncFP func, short blocktype, int retval, char *str,
		short x1, short y1, short x2, short y2, void *idpp, char *tip);
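As an illustration of the uiIDPoinFuncFP contract above: the callback receives the name typed into the button and writes the matching ID (or NULL) back through idpp. This is a hedged sketch; CTX_data_main(), the ID_OB blocktype value and the myspace_object pointer are assumptions for illustration, not taken from this header.

#include <string.h>

static struct Object *myspace_object;	/* hypothetical pointer edited by the button */

static void myspace_object_idpoin(struct bContext *C, char *str, struct ID **idpp)
{
	struct ID *id;

	/* assumed: walk the object list of the main database looking for the name */
	for(id= CTX_data_main(C)->object.first; id; id= id->next) {
		if(strcmp(id->name+2, str)==0) {	/* +2 skips the two letter ID code */
			*idpp= id;
			return;
		}
	}
	*idpp= NULL;
}

/* in the block building code: */
uiDefIDPoinBut(block, myspace_object_idpoin, ID_OB, 1, "OB:",
		10, 30, 150, 20, &myspace_object, "Sketch: assign an object by typing its name");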
uiBut *uiDefBlockBut(uiBlock *block, uiBlockFuncFP func, void *func_arg1, char *str, short x1, short y1, short x2, short y2, char *tip);
uiBut *uiDefPulldownBut(uiBlock *block, uiBlockFuncFP func, void *func_arg1, char *str, short x1, short y1, short x2, short y2, char *tip);
uiBut *uiDefIconTextBlockBut(uiBlock *block, uiBlockFuncFP func, void *arg, int icon, char *str, short x1, short y1, short x2, short y2, char *tip);
uiBut *uiDefIconBlockBut(uiBlock *block, uiBlockFuncFP func, void *arg, int retval, int icon, short x1, short y1, short x2, short y2, char *tip);
void uiDefKeyevtButS(uiBlock *block, int retval, char *str, short x1, short y1, short x2, short y2, short *spoin, char *tip);
void uiAutoBlock(struct uiBlock *block,
		float minx, float miny,
		float sizex, float sizey, int flag);
void uiSetButLink(struct uiBut *but,
		void **poin,
		void ***ppoin,
		short *tot,
		int from, int to);
int uiBlocksGetYMin (ListBase *lb);
int uiBlockGetCol (uiBlock *block);
void* uiBlockGetCurFont (uiBlock *block);
void uiBlockSetCol (uiBlock *block, int col);
void uiBlockSetEmboss (uiBlock *block, int emboss);
void uiBlockSetDirection (uiBlock *block, int direction);
void uiBlockFlipOrder (uiBlock *block);
void uiBlockSetFlag (uiBlock *block, int flag);
void uiBlockSetXOfs (uiBlock *block, int xofs);
int uiButGetRetVal (uiBut *but);
void uiButSetFlag (uiBut *but, int flag);
void uiButClearFlag (uiBut *but, int flag);
struct PointerRNA *uiButGetOperatorPtrRNA(uiBut *but);
void uiBlockSetHandleFunc(uiBlock *block, void (*func)(struct bContext *C, void *arg, int event), void *arg);
void uiBlockSetButmFunc (uiBlock *block, void (*func)(struct bContext *C, void *arg, int but_a2), void *arg);
void uiBlockSetFunc (uiBlock *block, void (*func)(struct bContext *C, void *arg1, void *arg2), void *arg1, void *arg2);
void uiButSetFunc (uiBut *but, void (*func)(struct bContext *C, void *arg1, void *arg2), void *arg1, void *arg2);
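A small sketch of the per-button callback mechanism: the function registered with uiButSetFunc receives the context plus the two opaque arguments given at registration time. Apart from the uiButSetFunc and uiDefButF signatures above, everything here is illustrative.

static void value_changed_cb(struct bContext *C, void *arg1, void *arg2)
{
	float *value= arg1;	/* arg1/arg2 are exactly what was passed to uiButSetFunc */

	/* clamp as a stand-in for whatever update the editor really needs */
	if(*value < 0.0f) *value= 0.0f;
}

/* when defining the button: */
uiBut *but= uiDefButF(block, NUM, 1, "Value:", 10, 55, 120, 20, &test_value, 0.0, 10.0, 10, 2, "Sketch");
uiButSetFunc(but, value_changed_cb, &test_value, NULL);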
void uiButSetCompleteFunc(uiBut *but, void (*func)(struct bContext *C, char *str, void *arg), void *arg);
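The complete callback gets the partially typed string, which it may rewrite in place, and the arg pointer passed to uiButSetCompleteFunc. A sketch against a fixed name table; real code would presumably walk an ID list instead, and it assumes str is large enough to hold the completed name.

#include <string.h>

static void preset_complete_cb(struct bContext *C, char *str, void *arg)
{
	const char *names[]= {"Default", "Render", "Compositing", NULL};
	size_t a, len= strlen(str);

	for(a=0; names[a]; a++) {
		if(len && strncmp(names[a], str, len)==0) {
			strcpy(str, names[a]);	/* complete to the first matching preset */
			break;
		}
	}
}

/* after defining a text button 'but': */
uiButSetCompleteFunc(but, preset_complete_cb, NULL);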
void uiBlockSetDrawExtraFunc(uiBlock *block, void (*func)(struct bContext *C, uiBlock *block));
UI: user interface events are no longer handled through operators, but through
a special UI handler, which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers adds the region level UI handlers, to be used when
adding keymap handlers etc. This replaces the UI keymap (see the sketch after
this commit note).
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it adds a UI handler at the window level which blocks
events.
Windowmanager changes:
* Added a UI handler type next to the existing keymap and operator modal
handlers. It has an event handling and a remove callback, and like operator
modal handlers it remembers the area and region if it is registered at the
window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events when the mouse leaves the region. This
was already happening for areas, but UI buttons live at the region level, so
we need it there too.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons sometimes seem to move by a pixel or so when
interacting with them.
* Seemingly not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are never removed. Probably
there should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
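Based on the notes above, a sketch of what wiring the new UI handler into a region might look like. UI_add_region_handlers is named in the notes; that it takes the region's handler ListBase (ar->handlers) and that the region init callback has this signature are assumptions.

static void myspace_header_region_init(struct wmWindowManager *wm, struct ARegion *ar)
{
	/* the region level UI handlers replace the old UI keymap */
	UI_add_region_handlers(&ar->handlers);
}

The matching draw callback would then build its blocks with uiBeginBlock(C, ar, ...) and free them again with uiFreeBlocks, both of which now take the context as noted above.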
/* Panels */
extern void uiFreePanels(struct ListBase *lb);
extern void uiNewPanelTabbed(char *, char *);
extern int uiNewPanel(const struct bContext *C, struct ARegion *ar, uiBlock *block, char *panelname, char *tabname, int ofsx, int ofsy, int sizex, int sizey);
extern void uiDrawPanels(const struct bContext *C, int re_align);
extern void uiSetPanelsView2d(struct ARegion *ar);
extern void uiMatchPanelsView2d(struct ARegion *ar);
extern void uiNewPanelHeight(struct uiBlock *block, int sizey);
extern void uiNewPanelTitle(struct uiBlock *block, char *str);
extern uiBlock *uiFindOpenPanelBlockName(ListBase *lb, char *name);
extern int uiAlignPanelStep(struct ScrArea *sa, struct ARegion *ar, float fac);
extern void uiPanelControl(int);
extern void uiSetPanelHandler(int);
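/* A minimal usage sketch, not part of the original header: how the panel API
   above might be driven from a region draw callback. Only uiNewPanel and
   uiDrawPanels are taken from the declarations here; uiBeginBlock, uiEndBlock,
   uiDefBut and the block/panel names are assumptions for illustration. */
#if 0
static void region_draw_panels_example(const struct bContext *C, struct ARegion *ar)
{
	uiBlock *block;

	/* assumed block API: one block created per panel, per region */
	block= uiBeginBlock(C, ar, "example_panel", UI_EMBOSS, UI_HELV);

	/* register the panel in this region; assumed to return 0 when the
	   panel is closed and its buttons don't need to be created */
	if(uiNewPanel(C, ar, block, "Example", "Example", 0, 0, 318, 204)) {
		/* ... uiDefBut() calls adding buttons to the block ... */
	}

	uiEndBlock(C, block);

	/* lay out and draw all panel blocks created for this region */
	uiDrawPanels(C, 1);
}
#endif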
/* Autocomplete */
typedef struct AutoComplete AutoComplete;
AutoComplete *autocomplete_begin(char *startname, int maxlen);
void autocomplete_do_name(AutoComplete *autocpl, const char *name);
void autocomplete_end(AutoComplete *autocpl, char *autoname);
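/* A minimal sketch, not part of the original header, of how the autocomplete
   functions above are meant to be chained: begin with the text typed so far,
   offer every candidate name, then end to write the completion back. The
   candidate list and the assumption that autocomplete_end() copies the result
   into the passed buffer are illustrative only. */
#if 0
static void autocomplete_example(char *str, int maxlen)
{
	const char *names[]= {"Cube", "Cube.001", "Camera", "Lamp"};
	AutoComplete *autocpl;
	int a;

	autocpl= autocomplete_begin(str, maxlen);	/* str holds the partial name */

	for(a=0; a<4; a++)
		autocomplete_do_name(autocpl, names[a]);	/* offer each candidate */

	autocomplete_end(autocpl, str);	/* assumed: writes the best completion into str */
}
#endif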
/* Handlers for regions with UI blocks */
void UI_add_region_handlers(struct ListBase *handlers);
void UI_add_area_handlers(struct ListBase *handlers);
void UI_add_popup_handlers(struct ListBase *handlers, uiMenuBlockHandle *menu);
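/* Sketch, not part of the original header: where the region handler call
   would typically be made, e.g. from an editor's region init callback when
   the region's handlers are (re)created. The callback name, the wm argument
   and ar->handlers being the region handler list are assumptions. */
#if 0
static void example_area_region_init(struct wmWindowManager *wm, struct ARegion *ar)
{
	/* UI handler for the buttons in this region (replaces the old UI keymap);
	   editor keymap handlers would be added to the same list afterwards */
	UI_add_region_handlers(&ar->handlers);
}
#endif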
/* Module initialization and exit */
void UI_init(void);
void UI_init_userdef(void);
void UI_exit(void);
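/* Sketch, not part of the original header, of the intended call order from
   the window manager; the wrapper function names and the comments on what
   each call covers are assumptions. */
#if 0
void example_startup(void)
{
	UI_init();		/* once at startup, before any blocks are built */
	UI_init_userdef();	/* after the user preferences are read, to pick up UI settings */
}

void example_exit(void)
{
	UI_exit();		/* on quit, free resources held by the UI module */
}
#endif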
#endif /* UI_INTERFACE_H */