/**
 * ***** BEGIN GPL LICENSE BLOCK *****
 *
 * This program is free software; you can redistribute it and/or
 * modify it under the terms of the GNU General Public License
 * as published by the Free Software Foundation; either version 2
 * of the License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program; if not, write to the Free Software Foundation,
 * Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
 *
 * The Original Code is Copyright (C) 2008 Blender Foundation.
 * All rights reserved.
 *
 * Contributor(s): Blender Foundation
 *
 * ***** END GPL LICENSE BLOCK *****
 */
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There are two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48; these can be enabled again once trunk is
merged into this branch.
Further, this is not all of the interface code; many parts are still commented out:
* interface.c: nearly all button types; missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: replaces ui_do_but and uiDoBlocks with two operators,
making the code non-blocking.
* interface_regions.c: a part of interface.c, split off; contains code to
create regions for tooltips, menus, pupmenu (that one is currently crashing)
and the color chooser; basically regions with buttons, which is fairly
independent of the core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c (formerly language.c): for drawing text and translation. This works
but still uses bad globals and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc. is still nearly the same. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That does mean we need some sort of
persistence, even though the Blender model is to recreate buttons for each
redraw. So when a new block is created, some matching happens to find out
which buttons correspond to buttons in the previously created block, and for
activated buttons some data is then copied over to the new button (see the
sketch after this list).
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize the
code in this module, instead of multiple function calls in the windowmanager.
* Removed most statics/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure
how it would integrate here.
* Currently only full window redraws are used; this should become per region
and maybe per button later.
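
A minimal sketch of that matching step (hypothetical function and struct
fields, not the actual implementation):

    /* find the button in the previously created block that corresponds to
       'newbut', so active/editing state can be carried over to it */
    static uiBut *ui_but_find_match(uiBlock *oldblock, uiBut *newbut)
    {
        uiBut *oldbut;

        for(oldbut= oldblock->buttons.first; oldbut; oldbut= oldbut->next)
            if(oldbut->type == newbut->type && oldbut->retval == newbut->retval)
                if(oldbut->str && newbut->str && strcmp(oldbut->str, newbut->str) == 0)
                    return oldbut;

        return NULL;
    }
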
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead; this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be
made non-blocking. That means variables that were previously on the stack now
need to be stored away in a struct so that they can be accessed again when
the operator receives more events.
* Additionally, the position in the ui_do_but code used to indicate the state;
now the state needs to be set explicitly in order to handle the right events
in the right state. So an activated button can be in one of these states:
init, highlight, wait_flash, wait_release, wait_key_event, num_editing,
text_editing, text_selecting, block_open, exit (see the sketch after this
list).
* For each button type, a ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed, for example, and there is an option in the code to enable
this. Since the code is non-blocking and can even deal with the button being
deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is
not being edited, since that would mean data is being edited without correct
updates happening, while some other part of Blender may be accessing that
data in the meantime. So data values, strings and vectors are written to a
temporary location and only flushed in the apply function.
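
A hedged sketch of those last points (the enum values mirror the states listed
above; the struct and its field names are hypothetical, the real ones live in
interface_ops.c):

    typedef enum {
        BUTTON_STATE_INIT,
        BUTTON_STATE_HIGHLIGHT,
        BUTTON_STATE_WAIT_FLASH,
        BUTTON_STATE_WAIT_RELEASE,
        BUTTON_STATE_WAIT_KEY_EVENT,
        BUTTON_STATE_NUM_EDITING,
        BUTTON_STATE_TEXT_EDITING,
        BUTTON_STATE_TEXT_SELECTING,
        BUTTON_STATE_BLOCK_OPEN,
        BUTTON_STATE_EXIT
    } uiButtonState;

    /* variables that used to live on the ui_do_but stack, kept between
       events while the button activate operator is running; the edited
       values are only flushed to but->poin in the apply function */
    typedef struct uiButtonActivate {
        uiBut *but;
        uiButtonState state;

        double value, origvalue;    /* number editing */
        char *str, *origstr;        /* text editing */
        float vec[3], origvec[3];   /* color/vector buttons */
    } uiButtonActivate;
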
Regions:
* Menus, the color chooser, tooltips, etc. all create screen-level regions.
Such menu blocks give a handle to the button that created them; that handle
will contain the results of the menu block once a MESSAGE event is received
from the menu block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way, and ui_positionblock still works with window
coordinates, but after that the block and its buttons are brought back to
region coordinates, since they are now contained in a region (see the sketch
after this list).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
should have enough information with these screen-level regions to have full
control over what gets drawn when, and to then do correct compositing.
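
The conversion back to region coordinates is just an offset by the region's
window rectangle, roughly (a sketch; 'rect' is assumed to hold the block's
window-space bounds):

    /* translate a window-space rect into region space */
    rect->xmin -= ar->winrct.xmin;  rect->xmax -= ar->winrct.xmin;
    rect->ymin -= ar->winrct.ymin;  rect->ymax -= ar->winrct.ymin;
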
Testing:
* The header in the time space currently has some buttons to test the UI code.

#include <stdarg.h>
#include <stdlib.h>
#include <string.h>

#include "MEM_guardedalloc.h"

#include "DNA_screen_types.h"
# include "DNA_view2d_types.h"
# include "DNA_windowmanager_types.h"
# include "BLI_arithb.h"
# include "BLI_blenlib.h"
# include "BLI_dynstr.h"
# include "BKE_context.h"
# include "BKE_report.h"
# include "BKE_screen.h"
# include "BKE_utildefines.h"
# include "WM_api.h"
# include "WM_types.h"

2.5: WM Compositing
* Triple Buffer is now more complete:
- Proper handling of window resize, duplicate, etc.
- It now uses 3x3 textures (or fewer) if the power-of-two sizes
do not match well. That still has a worst-case waste of 23.4%,
but better than 300%.
- It can also use the ARB/NV/EXT_texture_rectangle extension
now, which may be supported on hardware that does not support
ARB_texture_non_power_of_two.
- Gesture, menu and brush drawing now requires no redraws at all
from the area regions. So even in a high-poly scene, just moving
the paint cursor or opening a menu should be fast.
* Testing can be done by setting the "Window Draw Method" in the
User Preferences in the outliner. "Overlap" is still the default,
since "Triple Buffer" has not been tested on computers other than
mine; I would like to avoid crashing Blender on startup in case
there is a common bug, but it's ready for testing now.
- For reference, "Full" draws the full window each time.
- "Triple Buffer" should work for both swap copy and swap exchange
systems; the latter still needs the -E command line option for
"Overlap".
- Resizing and going fullscreen still give flicker here, but no
more than with "Full" drawing.
* Partial Redraw was added. ED_region_tag_redraw_partial takes a
rect in window coordinates to define a subarea of the region.
On region draw it will then set glScissor to a smaller area, and
ar->drawrct will always be set to either the partial or the full
window rect. The latter can then be used for clipping in the 3D
view or clipping interface drawing; neither is implemented yet.
(See the usage sketch below.)
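
A minimal usage sketch of that partial redraw tag (assuming the two-argument
form described above; the rect is in window coordinates):

    /* tag only the strip of the region that actually changed */
    rcti rct;

    rct.xmin= x;  rct.xmax= x + w;
    rct.ymin= y;  rct.ymax= y + h;

    ED_region_tag_redraw_partial(ar, &rct);
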
# include "wm_draw.h"
# include "wm_subwindow.h"
# include "wm_window.h"
# include "RNA_access.h"
# include "BIF_gl.h"
# include "UI_interface.h"
# include "UI_text.h"
# include "UI_view2d.h"
# include "ED_screen.h"
# include "interface_intern.h"

#define MENU_BUTTON_HEIGHT	20
#define MENU_SEPR_HEIGHT	6
#define B_NOP			-1

#define MENU_SHADOW_LEFT	-1
#define MENU_SHADOW_BOTTOM	-10
#define MENU_SHADOW_RIGHT	10
#define MENU_SHADOW_TOP		1
#define MENU_ROUNDED_TOP	5
/*********************** Menu Data Parsing ********************* */

typedef struct {
	char *str;
	int retval;
	int icon;
} MenuEntry;

typedef struct {
	char *instr;
	char *title;
	int titleicon;

	MenuEntry *items;
	int nitems, itemssize;
} MenuData;

static MenuData *menudata_new(char *instr)
{
	MenuData *md = MEM_mallocN(sizeof(*md), "MenuData");

	md->instr = instr;
	md->title = NULL;
	md->titleicon = 0;
	md->items = NULL;
	md->nitems = md->itemssize = 0;

	return md;
}

static void menudata_set_title(MenuData *md, char *title, int titleicon)
{
	if (!md->title)
		md->title = title;
	if (!md->titleicon)
		md->titleicon = titleicon;
}

static void menudata_add_item(MenuData *md, char *str, int retval, int icon)
{
	if (md->nitems == md->itemssize) {
		int nsize = md->itemssize ? (md->itemssize << 1) : 1;
		MenuEntry *oitems = md->items;

		md->items = MEM_mallocN(nsize * sizeof(*md->items), "md->items");
		if (oitems) {
			memcpy(md->items, oitems, md->nitems * sizeof(*md->items));
			MEM_freeN(oitems);
		}

		md->itemssize = nsize;
	}

	md->items[md->nitems].str = str;
	md->items[md->nitems].retval = retval;
	md->items[md->nitems].icon = icon;
	md->nitems++;
}

void menudata_free(MenuData *md)
{
	MEM_freeN(md->instr);
	if (md->items)
		MEM_freeN(md->items);
	MEM_freeN(md);
}
/**
 * Parse menu description strings, string is of the
 * form "[sss%t|]{(sss[%xNN]|), (%l|)}", sss%t indicates the
 * menu title, sss or sss%xNN indicates an option,
 * if %xNN is given then NN is the return value if
 * that option is selected otherwise the return value
 * is the index of the option (starting with 1). %l
 * indicates a separator.
 *
 * @param str String to be parsed.
 * @retval new menudata structure, free with menudata_free()
 */
MenuData *decompose_menu_string(char *str)
{
	char *instr = BLI_strdup(str);
	MenuData *md = menudata_new(instr);
	char *nitem = NULL, *s = instr;
	int nicon = 0, nretval = 1, nitem_is_title = 0;

	while (1) {
		char c = *s;

		if (c == '%') {
			if (s[1] == 'x') {
				nretval = atoi(s + 2);

				*s = '\0';
				s++;
			} else if (s[1] == 't') {
				nitem_is_title = 1;

				*s = '\0';
				s++;
			} else if (s[1] == 'l') {
				nitem = "%l";
				s++;
			} else if (s[1] == 'i') {
				nicon = atoi(s + 2);

				*s = '\0';
				s++;
			}
		} else if (c == '|' || c == '\0') {
			if (nitem) {
				*s = '\0';

				if (nitem_is_title) {
					menudata_set_title(md, nitem, nicon);
					nitem_is_title = 0;
				} else {
					/* prevent the separator from getting a value */
					if (nitem[0] == '%' && nitem[1] == 'l')
						menudata_add_item(md, nitem, -1, nicon);
					else
						menudata_add_item(md, nitem, nretval, nicon);
					nretval = md->nitems + 1;
				}

				nitem = NULL;
				nicon = 0;
			}

			if (c == '\0')
				break;
		} else if (!nitem)
			nitem = s;

		s++;
	}

	return md;
}
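As an illustration of the format documented above (this example is not part of the original source; the string and values are hypothetical), a typical menu string decomposes like this:

/* "%t" marks the title, "%xNN" gives an explicit return value, "%l" a separator. */
static void menu_string_example(void)
{
	MenuData *md = decompose_menu_string("Add Object%t|Cube%x1|Sphere%x2|%l|Monkey%x10");

	/* md->title == "Add Object", md->nitems == 4:
	 * "Cube" (retval 1), "Sphere" (retval 2), "%l" (retval -1), "Monkey" (retval 10) */
	menudata_free(md);
}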
void ui_set_name_menu(uiBut *but, int value)
{
	MenuData *md;
	int i;

	md = decompose_menu_string(but->str);
	for (i = 0; i < md->nitems; i++)
		if (md->items[i].retval == value)
			strcpy(but->drawstr, md->items[i].str);

	menudata_free(md);
}
/******************** Creating Temporary regions ******************/

ARegion *ui_add_temporary_region(bScreen *sc)
{
	ARegion *ar;

	ar = MEM_callocN(sizeof(ARegion), "area region");
	BLI_addtail(&sc->regionbase, ar);

	ar->regiontype = RGN_TYPE_TEMPORARY;
	ar->alignment = RGN_ALIGN_FLOAT;

	return ar;
}

void ui_remove_temporary_region(bContext *C, bScreen *sc, ARegion *ar)
{
2.5: WM Compositing
* Triple Buffer is now more complete:
- Proper handling of window resize, duplicate, etc.
- It now uses 3x3 textures (or fewer) if the power-of-two sizes
do not match well. That still has a worst-case waste of 23.4%,
but better than 300%.
- It can also use the ARB/NV/EXT_texture_rectangle extension
now, which may be supported on hardware that does not support
ARB_texture_non_power_of_two.
- Gesture, menu and brush redraws no longer require any redraws
from the area regions. So even in a high-poly scene, just moving
the paint cursor or opening a menu should be fast.
* Testing can be done by setting the "Window Draw Method" in the
User Preferences in the outliner. "Overlap" is still the default,
since "Triple Buffer" has not been tested on computers other than
mine and I would like to avoid crashing Blender on startup in case
there is a common bug, but it's ready for testing now.
- For reference, "Full" draws the full window each time.
- "Triple Buffer" should work for both swap copy and swap exchange
systems; the latter still needs the -E command line option for
"Overlap".
- Resizing and going fullscreen still give flicker here, but no
more than with "Full" drawing.
* Partial Redraw was added. ED_region_tag_redraw_partial takes a
rect in window coordinates to define a subarea of the region.
On region draw it will then set glScissor to a smaller area, and
ar->drawrct will always be set to either the partial or the full
window rect. The latter can then be used for clipping in the 3D
view or for clipping interface drawing. Neither is implemented yet.
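A minimal sketch of the partial redraw call described above, assuming ED_region_tag_redraw_partial takes the region plus a window-space rcti as described; the helper function and values here are hypothetical:

/* Sketch only: tag a small window-space rectangle around a brush cursor. */
static void tag_cursor_redraw(ARegion *ar, int cx, int cy, int radius)
{
	rcti rect;

	rect.xmin = cx - radius;
	rect.ymin = cy - radius;
	rect.xmax = cx + radius;
	rect.ymax = cy + radius;

	ED_region_tag_redraw_partial(ar, &rect);
	/* On the next draw, glScissor is limited to this area and ar->drawrct is
	 * set to either this partial rect or the full window rect. */
}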
2009-01-23 03:52:52 +00:00
	wm_draw_region_clear(CTX_wm_window(C), ar);
2008-11-11 18:31:32 +00:00
	ED_region_exit(C, ar);
2.5
View3D has now been split into a per-region part (RegionView3D) and a
per-area part (the old View3D). Currently per region are:
- view transform
- camera zoom/offset
- gpencil (todo)
- custom clipping planes
The rest is still in the area, like the active camera, draw type, layers,
localview, custom centers, around-settings, transform widget,
gridlines, and so on (mostly stuff available in the header).
To see it work, a new feature for region split was also added:
press SHIFT+ALT+CTRL+S for a four-split.
The idea is to make a preset 4-split, configured to stick
to top/right/front views for three of the views.
Another cool idea to explore is to then box-clip all drawing
based on these 3 views.
Note about the code:
- currently view3d still stores some deprecated settings, to
convert from older files. Not all settings are copied over
though, like custom clip planes or the 'lock view to object'.
- since some view3d ops are now on area level, the operators
for it should keep track of that.
Bugfix in transform: quat initialization in operator-invoke missed
one zero.
Also brought back GE to compile for missing Ipos and channels.
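A simplified sketch of the split described above; the field selection is illustrative only and does not reproduce the actual DNA structs, which contain many more members:

typedef struct RegionView3D {
	float viewmat[4][4];          /* view transform, now stored per region */
	float viewquat[4], dist;
	float camzoom, camdx, camdy;  /* camera zoom/offset */
	/* custom clipping planes, gpencil (todo), ... */
} RegionView3D;

typedef struct View3D {
	struct Object *camera;        /* still per area: active camera */
	short drawtype;               /* draw type, layers, localview, around, ... */
	int lay;
} View3D;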
2009-01-19 16:54:41 +00:00
	BKE_area_region_free(NULL, ar); /* NULL: no spacetype */
2008-11-11 18:31:32 +00:00
	BLI_freelinkN(&sc->regionbase, ar);
}

/************************* Creating Tooltips **********************/

typedef struct uiTooltipData {
	rctf bbox;
	struct BMF_Font *font;
	char *tip;
	float aspect;
} uiTooltipData;

static void ui_tooltip_region_draw(const bContext *C, ARegion *ar)
{
	uiTooltipData *data;
	int x1, y1, x2, y2;

	data = ar->regiondata;
	x1 = ar->winrct.xmin;
	y1 = ar->winrct.ymin;
	x2 = ar->winrct.xmax;
	y2 = ar->winrct.ymax;

	/* draw drop shadow */
	glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
	glEnable(GL_BLEND);

	glColor4ub(0, 0, 0, 20);
	gl_round_box(GL_POLYGON, 3, 3, x2 - x1 - 3, y2 - y1 - 2, 2.0);
	gl_round_box(GL_POLYGON, 3, 2, x2 - x1 - 2, y2 - y1 - 2, 3.0);

	glColor4ub(0, 0, 0, 8);
	gl_round_box(GL_POLYGON, 3, 1, x2 - x1 - 1, y2 - y1 - 3, 4.0);
	gl_round_box(GL_POLYGON, 3, 0, x2 - x1 - 0, y2 - y1 - 3, 5.0);

	glDisable(GL_BLEND);

	/* draw background */
	glColor3f(1.0f, 1.0f, 0.8666f);
2008-12-03 23:21:01 +00:00
	glRectf(0, 4, x2 - x1 + 4, y2 - y1);
2008-11-11 18:31:32 +00:00
	/* draw text */
	glColor3ub(0, 0, 0);

	/* set the position for drawing text +4 in from the left edge, and leaving
	 * an equal gap between the top of the background box and the top of the
	 * string's bbox, and the bottom of the background box and the bottom of
	 * the string's bbox */
	ui_rasterpos_safe(4, ((y2 - data->bbox.ymax) + (y1 + data->bbox.ymin)) / 2 - data->bbox.ymin - y1, data->aspect);

	UI_SetScale(1.0);
	UI_DrawString(data->font, data->tip, ui_translate_tooltips());
}

static void ui_tooltip_region_free(ARegion *ar)
{
	uiTooltipData *data;

	data = ar->regiondata;
	MEM_freeN(data->tip);
	MEM_freeN(data);
2.5
Added WM Jobs manager
- WM can manage threaded jobs for you; just provide a couple
of components to get it to work:
- customdata, and a free callback for it
- timer step, notifier code
- start callback, update callback
- Once started, each job runs its own timer, and on every
time step it checks for necessary updates, or closes the
job when ready.
- No drawing happens in jobs, that's for notifiers!
- Every job stores an owner pointer, and based on this owner
it will prevent multiple jobs from entering the stack.
Instead it will re-use a running job, signal it to stop,
and even allow the caller to re-initialize it.
- Check the new wm_jobs.c for more explanation. The Jobs API is
still under construction.
Fun: BLI_addtail(&wm->jobs, steve); :)
Put Node shader previews back using wmJobs
- Preview calculation is now fully threaded (still 1 thread)
- Thanks to the new event system + notifiers, you can see
previews update even while dragging sliders!
- Currently it only starts when you change a node setting.
Warning: the threaded render shares Node data, so don't delete
nodes while it renders! Making this safe is on the todo list.
Also:
- a bug in region initialization (do_versions) showed the channel
list in the node editor wrongly.
- flagged the channel list 'hidden' now, it was really in the
way! This is to work on later anyway.
- recoded the Render API callbacks so they get handlers passed in;
no globals to use anymore, remember?
- the previewrender code now gets so much nicer! Will remove a lot
of stuff from the code soon.
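A rough usage sketch of the pattern described above; the WM_jobs_* names and signatures are assumptions for illustration only (the commit itself points to wm_jobs.c for the actual, still-changing API):

/* Hypothetical sketch, not the real API; all WM_jobs_* calls are assumed names. */
static void job_startfunc(void *customdata, short *stop, short *do_update) { /* heavy work, no drawing */ }
static void job_freefunc(void *customdata) { MEM_freeN(customdata); }

static void start_preview_job(bContext *C, void *owner)
{
	/* the owner pointer keeps duplicate jobs off the stack */
	wmJob *steve = WM_jobs_get(CTX_wm_manager(C), CTX_wm_window(C), owner);

	WM_jobs_customdata(steve, MEM_callocN(64, "job data"), job_freefunc); /* customdata + free callback */
	WM_jobs_timer(steve, 0.1, NC_NODE, NC_NODE);                          /* timer step + notifier code */
	WM_jobs_callbacks(steve, job_startfunc, NULL, NULL);                  /* start callback (+ optional update) */
	WM_jobs_start(CTX_wm_manager(C), steve);
	/* no drawing in the job thread; updates go out via notifiers */
}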
2009-01-22 14:59:49 +00:00
	ar->regiondata = NULL;
2008-11-11 18:31:32 +00:00
}
ARegion *ui_tooltip_create(bContext *C, ARegion *butregion, uiBut *but)
{
2008-12-08 15:02:57 +00:00
	static ARegionType type;
2008-11-11 18:31:32 +00:00
	ARegion *ar;
	uiTooltipData *data;
2008-12-26 13:11:04 +00:00
	int x1, x2, y1, y2, winx, winy, ofsx, ofsy;
2008-11-11 18:31:32 +00:00
	if (!but->tip || strlen(but->tip) == 0)
		return NULL;

	/* create area region */
2008-12-18 02:56:48 +00:00
	ar = ui_add_temporary_region(CTX_wm_screen(C));
2008-11-11 18:31:32 +00:00
2008-12-08 15:02:57 +00:00
	memset(&type, 0, sizeof(ARegionType));
2008-11-11 18:31:32 +00:00
	type.draw = ui_tooltip_region_draw;
	type.free = ui_tooltip_region_free;
	ar->type = &type;

	/* create tooltip data */
	data = MEM_callocN(sizeof(uiTooltipData), "uiTooltipData");
	data->tip = BLI_strdup(but->tip);
	data->font = but->font;
	data->aspect = but->aspect;
	UI_GetBoundingBox(data->font, data->tip, ui_translate_tooltips(), &data->bbox);
	ar->regiondata = data;

	/* compute position */
2008-12-26 13:11:04 +00:00
	ofsx = (but->block->panel) ? but->block->panel->ofsx : 0;
	ofsy = (but->block->panel) ? but->block->panel->ofsy : 0;

	x1 = (but->x1 + but->x2) / 2 + ofsx;
2008-11-11 18:31:32 +00:00
	x2 = x1 + but->aspect * ((data->bbox.xmax - data->bbox.xmin) + 8);
	y2 = but->y1 - 10 + ofsy;
	y1 = y2 - but->aspect * ((data->bbox.ymax + (data->bbox.ymax - data->bbox.ymin)));
	y2 += 8;
	x2 += 8;
	if (butregion) {
		/* XXX temp, region v2ds can be empty still */
		if (butregion->v2d.cur.xmin != butregion->v2d.cur.xmax) {
			UI_view2d_to_region_no_clip(&butregion->v2d, x1, y1, &x1, &y1);
			UI_view2d_to_region_no_clip(&butregion->v2d, x2, y2, &x2, &y2);
		}
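
		/* add the region origin to convert to window coordinates */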
		x1 += butregion->winrct.xmin;
		x2 += butregion->winrct.xmin;
		y1 += butregion->winrct.ymin;
		y2 += butregion->winrct.ymin;
	}
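
	/* clamp the tooltip so it does not run off the window */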
	wm_window_get_size(CTX_wm_window(C), &winx, &winy);
	if (x2 > winx) {
		/* super size */
		if (x2 > winx + x1) {
			x2 = winx;
			x1 = 0;
		}
		else {
			x1 -= x2 - winx;
			x2 = winx;
		}
	}
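
	/* if the tooltip would fall below the window, shift it upwards */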
	if (y1 < 0) {
		y1 += 36;
		y2 += 36;
	}
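
	/* final region rectangle in window coordinates */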
	ar->winrct.xmin = x1;
	ar->winrct.ymin = y1;
	ar->winrct.xmax = x2;
	ar->winrct.ymax = y2;
	/* adds subwindow */
	ED_region_init(C, ar);
	/* notify change and redraw */
	ED_region_tag_redraw(ar);
	return ar;
}

void ui_tooltip_free(bContext *C, ARegion *ar)
{
	ui_remove_temporary_region(C, CTX_wm_screen(C), ar);
}
/************************* Creating Menu Blocks **********************/
/* position block relative to but, result is in window space */
static void ui_block_position(wmWindow *window, ARegion *butregion, uiBut *but, uiBlock *block)
{
	uiBut *bt;
	uiSafetyRct *saferct;
	rctf butrct;
	float aspect;
	int xsize, ysize, xof = 0, yof = 0, center;
	short dir1 = 0, dir2 = 0;

	/* transform to window coordinates, using the source button region/block */
	butrct.xmin = but->x1; butrct.xmax = but->x2;
	butrct.ymin = but->y1; butrct.ymax = but->y2;

	ui_block_to_window_fl(butregion, but->block, &butrct.xmin, &butrct.ymin);
	ui_block_to_window_fl(butregion, but->block, &butrct.xmax, &butrct.ymax);

	/* calc block rect */
	if (block->minx == 0.0f && block->maxx == 0.0f) {
		if (block->buttons.first) {
			block->minx = block->miny = 10000;
			block->maxx = block->maxy = -10000;

			bt = block->buttons.first;
			while (bt) {
				if (bt->x1 < block->minx) block->minx = bt->x1;
				if (bt->y1 < block->miny) block->miny = bt->y1;

				if (bt->x2 > block->maxx) block->maxx = bt->x2;
				if (bt->y2 > block->maxy) block->maxy = bt->y2;

				bt = bt->next;
			}
		}
		else {
			/* we're nice and allow empty blocks too */
			block->minx = block->miny = 0;
			block->maxx = block->maxy = 20;
		}
	}
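
	/* block width before the window transform; dividing by xsize below
	 * gives the ratio between block and window coordinates */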
	aspect = (float)(block->maxx - block->minx + 4);

	ui_block_to_window_fl(butregion, but->block, &block->minx, &block->miny);
	ui_block_to_window_fl(butregion, but->block, &block->maxx, &block->maxy);

	//block->minx-= 2.0; block->miny-= 2.0;
	//block->maxx+= 2.0; block->maxy+= 2.0;

	xsize = block->maxx - block->minx + 4;  // 4 for shadow
	ysize = block->maxy - block->miny + 4;
	aspect /= (float)xsize;

	if (but) {
		int left = 0, right = 0, top = 0, down = 0;
		int winx, winy;

		wm_window_get_size(window, &winx, &winy);

		if (block->direction & UI_CENTER) center = ysize / 2;
		else center = 0;
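
		/* check on which sides of the button there is room for the block */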
		if (butrct.xmin - xsize > 0.0) left = 1;
		if (butrct.xmax + xsize < winx) right = 1;
		if (butrct.ymin - ysize + center > 0.0) down = 1;
		if (butrct.ymax + ysize - center < winy) top = 1;

		dir1 = block->direction & UI_DIRECTION;

		/* secondary directions */
		if (dir1 & (UI_TOP | UI_DOWN)) {
			if (dir1 & UI_LEFT) dir2 = UI_LEFT;
			else if (dir1 & UI_RIGHT) dir2 = UI_RIGHT;
			dir1 &= (UI_TOP | UI_DOWN);
		}
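
		/* fall back to a default secondary direction when none was given */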
		if (dir2 == 0) if (dir1 == UI_LEFT || dir1 == UI_RIGHT) dir2 = UI_DOWN;
		if (dir2 == 0) if (dir1 == UI_TOP || dir1 == UI_DOWN) dir2 = UI_LEFT;

		/* no space at all? don't change */
		if (left || right) {
			if (dir1 == UI_LEFT && left == 0) dir1 = UI_RIGHT;
			if (dir1 == UI_RIGHT && right == 0) dir1 = UI_LEFT;
			/* this is aligning, not append! */
			if (dir2 == UI_LEFT && right == 0) dir2 = UI_RIGHT;
			if (dir2 == UI_RIGHT && left == 0) dir2 = UI_LEFT;
		}
		if (down || top) {
			if (dir1 == UI_TOP && top == 0) dir1 = UI_DOWN;
			if (dir1 == UI_DOWN && down == 0) dir1 = UI_TOP;
			if (dir2 == UI_TOP && top == 0) dir2 = UI_DOWN;
			if (dir2 == UI_DOWN && down == 0) dir2 = UI_TOP;
		}
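
		/* compute the block offset relative to the button for the chosen direction */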
		if (dir1 == UI_LEFT) {
			xof = butrct.xmin - block->maxx;
			if (dir2 == UI_TOP) yof = butrct.ymin - block->miny - center;
			else yof = butrct.ymax - block->maxy + center;
		}
		else if (dir1 == UI_RIGHT) {
			xof = butrct.xmax - block->minx;
			if (dir2 == UI_TOP) yof = butrct.ymin - block->miny - center;
			else yof = butrct.ymax - block->maxy + center;
		}
		else if (dir1 == UI_TOP) {
			yof = butrct.ymax - block->miny;
			if (dir2 == UI_RIGHT) xof = butrct.xmax - block->maxx;
			else xof = butrct.xmin - block->minx;
			// changed direction?
			if ((dir1 & block->direction) == 0) {
				if (block->direction & UI_SHIFT_FLIPPED)
					xof += dir2 == UI_LEFT ? 25 : -25;
				uiBlockFlipOrder(block);
			}
		}
		else if (dir1 == UI_DOWN) {
			yof = butrct.ymin - block->maxy;
			if (dir2 == UI_RIGHT) xof = butrct.xmax - block->maxx;
			else xof = butrct.xmin - block->minx;
			// changed direction?
			if ((dir1 & block->direction) == 0) {
				if (block->direction & UI_SHIFT_FLIPPED)
					xof += dir2 == UI_LEFT ? 25 : -25;
				uiBlockFlipOrder(block);
			}
		}

		/* and now we handle the exception; no space below or to top */
		if (top == 0 && down == 0) {
			if (dir1 == UI_LEFT || dir1 == UI_RIGHT) {
				// align with bottom of screen
				yof = ysize;
			}
		}
		/* or no space left or right */
		if (left == 0 && right == 0) {
			if (dir1 == UI_TOP || dir1 == UI_DOWN) {
				// align with left side of screen
				xof = -block->minx + 5;
			}
		}

		// apply requested offset in the block
		xof += block->xofs / block->aspect;
		yof += block->yofs / block->aspect;
	}

	/* apply */
	for (bt = block->buttons.first; bt; bt = bt->next) {
		ui_block_to_window_fl(butregion, but->block, &bt->x1, &bt->y1);
		ui_block_to_window_fl(butregion, but->block, &bt->x2, &bt->y2);

		bt->x1 += xof;
		bt->x2 += xof;
		bt->y1 += yof;
		bt->y2 += yof;

		bt->aspect = 1.0;
		// ui_check_but recalculates drawstring size in pixels
		ui_check_but(bt);
	}

	block->minx += xof;
	block->miny += yof;
	block->maxx += xof;
	block->maxy += yof;

	/* safety calculus */
	if (but) {
		float midx = (butrct.xmin + butrct.xmax) / 2.0;
		float midy = (butrct.ymin + butrct.ymax) / 2.0;

		/* when you are outside parent button, safety there should be smaller */

		// parent button to left
		if (midx < block->minx) block->safety.xmin = block->minx - 3;
		else block->safety.xmin = block->minx - 40;
		// parent button to right
		if (midx > block->maxx) block->safety.xmax = block->maxx + 3;
		else block->safety.xmax = block->maxx + 40;

		// parent button on bottom
		if (midy < block->miny) block->safety.ymin = block->miny - 3;
		else block->safety.ymin = block->miny - 40;
		// parent button on top
		if (midy > block->maxy) block->safety.ymax = block->maxy + 3;
		else block->safety.ymax = block->maxy + 40;

		// exception for switched pulldowns...
		if (dir1 && (dir1 & block->direction) == 0) {
			if (dir2 == UI_RIGHT) block->safety.xmax = block->maxx + 3;
			if (dir2 == UI_LEFT) block->safety.xmin = block->minx - 3;
		}
		block->direction = dir1;
	}
	else {
		block->safety.xmin = block->minx - 40;
		block->safety.ymin = block->miny - 40;
		block->safety.xmax = block->maxx + 40;
		block->safety.ymax = block->maxy + 40;
	}

	/* keep a list of these, needed for pulldown menus */
	saferct = MEM_callocN(sizeof(uiSafetyRct), "uiSafetyRct");
	saferct->parent = butrct;
	saferct->safety = block->safety;

	BLI_freelistN(&block->saferct);
	if (but)
		BLI_duplicatelist(&block->saferct, &but->block->saferct);
	BLI_addhead(&block->saferct, saferct);
}
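
/* region draw callback: draws all ui blocks attached to the region */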
static void ui_block_region_draw(const bContext *C, ARegion *ar)
{
	uiBlock *block;

	for (block = ar->uiblocks.first; block; block = block->next)
		uiDrawBlock(C, block);
}
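/* Create a popup block (menu, tooltip, color chooser, ...) inside a temporary
 * screen level region. The block itself is built by create_func, or by
 * handle_create_func when the builder needs access to the popup handle. The
 * returned uiPopupBlockHandle is stored as the region data and later receives
 * the popup's result. */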
uiPopupBlockHandle *ui_popup_block_create(bContext *C, ARegion *butregion, uiBut *but, uiBlockCreateFunc create_func, uiBlockHandleCreateFunc handle_create_func, void *arg)
{
	wmWindow *window= CTX_wm_window(C);
	static ARegionType type;
	ARegion *ar;
	uiBlock *block;
	uiBut *bt;
	uiPopupBlockHandle *handle;
	uiSafetyRct *saferct;
/* create handle */
	handle= MEM_callocN(sizeof(uiPopupBlockHandle), "uiPopupBlockHandle");
/* store context for operator */
	handle->ctx_area= CTX_wm_area(C);
	handle->ctx_region= CTX_wm_region(C);
/* create area region */
	ar= ui_add_temporary_region(CTX_wm_screen(C));
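	/* the region is added at screen level so the popup can overlap any area,
	   and is removed again when the popup is dismissed */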
	memset(&type, 0, sizeof(ARegionType));
	type.draw= ui_block_region_draw;
	ar->type= &type;
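	/* region level UI handlers make the buttons inside this popup respond to
	   events, just like buttons in regular regions */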
	UI_add_region_handlers(&ar->handlers);
	handle->region= ar;
	ar->regiondata= handle;
/* create ui block */
	if(create_func)
		block= create_func(C, handle->region, arg);
	else
		block= handle_create_func(C, handle, arg);
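	/* the handle_create_func variant also receives the popup handle, presumably
	   so the block builder can write its results back into it */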
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
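As a rough illustration of the activated-button states listed above, here is what such a state set could look like as a C enum. The type name and constant spellings are assumptions for illustration only, not necessarily the identifiers used in the actual source files.

/* Sketch only: a possible enum for the activated-button states listed above.
 * The type name and constant spellings are illustrative assumptions. */
typedef enum uiHandleButtonState {
	BUTTON_STATE_INIT,
	BUTTON_STATE_HIGHLIGHT,
	BUTTON_STATE_WAIT_FLASH,
	BUTTON_STATE_WAIT_RELEASE,
	BUTTON_STATE_WAIT_KEY_EVENT,
	BUTTON_STATE_NUM_EDITING,
	BUTTON_STATE_TEXT_EDITING,
	BUTTON_STATE_TEXT_SELECTING,
	BUTTON_STATE_BLOCK_OPEN,
	BUTTON_STATE_EXIT
} uiHandleButtonState;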
2008-11-11 18:31:32 +00:00
block->handle= handle;
2009-01-28 23:29:27 +00:00
if(!block->endblock)
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
/* if this is being created from a button */
if(but) {
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions
(a sketch of this pattern follows below).
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended the uiMenu functions, there is now also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the uiDefMenu
functions as they are still used in sequencer/view3d in some places
(will fix).
* Also tried to simplify and improve bounds computation for popup
menus. The old code tried to find out the size of the menu in advance,
which is difficult with keymap strings in there; now uiPopupBoundsBlock
can figure this out afterwards and ensure the popup stays within the
window bounds. Will convert some other functions to use this too.
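As a rough sketch of the header-menu pattern described above: a uiMenuCreateFunc builds the items and uiDefMenuBut hooks it into a header block. The argument lists below, the uiMenuItemO/uiMenuSeparator spellings, the operator name, and the button coordinates are assumptions for illustration, not the authoritative API.

/* Sketch with assumed signatures: a menu build callback for a header. */
static void view_menu(bContext *C, uiMenuItem *head, void *arg_unused)
{
	/* item functions following the naming scheme listed above */
	uiMenuItemO(head, 0, "VIEW2D_OT_view_all");	/* example operator item */
	uiMenuSeparator(head);				/* separator item */
}

/* in the header's button drawing code (block/xco/yco come from there):
 * a pulldown button that invokes the callback when clicked */
uiDefMenuBut(block, view_menu, NULL, "View", xco, yco, 85, 20, "");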
2009-01-30 12:18:08 +00:00
if(ELEM3(but->type, BLOCK, PULLDOWN, HMENU))
2008-11-11 18:31:32 +00:00
block->xofs= -2;	/* for proper alignment */

/* only used for automatic toolbox, so can set the shift flag */
if(but->flag & UI_MAKE_TOP) {
	block->direction= UI_TOP|UI_SHIFT_FLIPPED;
	uiBlockFlipOrder(block);
}
if(but->flag & UI_MAKE_DOWN) block->direction= UI_DOWN|UI_SHIFT_FLIPPED;
if(but->flag & UI_MAKE_LEFT) block->direction |= UI_LEFT;
if(but->flag & UI_MAKE_RIGHT) block->direction |= UI_RIGHT;
2008-12-18 02:56:48 +00:00
ui_block_position(window, butregion, but, block);
2008-11-11 18:31:32 +00:00
}
else {
	/* keep a list of these, needed for pulldown menus */
	saferct= MEM_callocN(sizeof(uiSafetyRct), "uiSafetyRct");
	saferct->safety= block->safety;
	BLI_addhead(&block->saferct, saferct);
}
/* the block and buttons were positioned in window space as in 2.4x, now
* these menu blocks are regions so we bring it back to region space.
2009-01-04 00:05:40 +00:00
* additionally we add some padding for the menu shadow or rounded menus */
2008-11-11 18:31:32 +00:00
ar->winrct.xmin= block->minx + MENU_SHADOW_LEFT;
ar->winrct.xmax= block->maxx + MENU_SHADOW_RIGHT;
ar->winrct.ymin= block->miny + MENU_SHADOW_BOTTOM;
2009-01-04 00:05:40 +00:00
ar->winrct.ymax= block->maxy + MENU_SHADOW_TOP + MENU_ROUNDED_TOP;
2008-11-11 18:31:32 +00:00
block->minx -= ar->winrct.xmin;
block->maxx -= ar->winrct.xmin;
block->miny -= ar->winrct.ymin;
block->maxy -= ar->winrct.ymin;

for(bt= block->buttons.first; bt; bt= bt->next) {
	bt->x1 -= ar->winrct.xmin;
	bt->x2 -= ar->winrct.xmin;
	bt->y1 -= ar->winrct.ymin;
	bt->y2 -= ar->winrct.ymin;
}

block->flag |= UI_BLOCK_LOOP|UI_BLOCK_MOVEMOUSE_QUIT;
2008-12-10 13:56:54 +00:00
/* adds subwindow */
ED_region_init(C, ar);
2008-12-15 19:19:39 +00:00
/* get winmat now that we actually have the subwindow */
2008-12-19 14:14:43 +00:00
wmSubWindowSet(window, ar->swinid);
2008-12-18 02:56:48 +00:00
wm_subwindow_getmatrix(window, ar->swinid, block->winmat);
2008-12-10 13:56:54 +00:00
2008-11-11 18:31:32 +00:00
/* notify change and redraw */
2008-12-16 12:28:00 +00:00
ED_region_tag_redraw(ar);
2008-11-11 18:31:32 +00:00
return handle;
}
2009-01-30 12:18:08 +00:00
void ui_popup_block_free(bContext *C, uiPopupBlockHandle *handle)
2008-11-11 18:31:32 +00:00
{
2008-12-18 02:56:48 +00:00
ui_remove_temporary_region(C, CTX_wm_screen(C), handle->region);
2008-11-11 18:31:32 +00:00
MEM_freeN(handle);
}
/***************************** Menu Button ***************************/
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_MENU(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, the color chooser, tooltips, etc. all create screen-level regions.
Such a menu block gives a handle to the button that created it, which will
contain the results of the menu block once a MESSAGE event is received from
that block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and its buttons are brought back to
region coordinates, since they are now contained in a region (see the sketch
after this list).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
should have enough information with these screen-level regions to have full
control over what gets drawn when, and to then do correct compositing.
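A minimal sketch of that window-to-region conversion; the structs and field names are illustrative stand-ins, not the real uiBlock/ARegion layout:

#include <stdio.h>

/* illustrative stand-ins for a region and a block/button rectangle */
typedef struct { int xmin, ymin; } DemoRegion;            /* region origin in window space */
typedef struct { float x1, y1, x2, y2; } DemoRect;

/* the block is positioned in window coordinates first, then brought back
 * to region coordinates by subtracting the region origin */
static void rect_window_to_region(DemoRect *r, const DemoRegion *region)
{
	r->x1 -= region->xmin;  r->x2 -= region->xmin;
	r->y1 -= region->ymin;  r->y2 -= region->ymin;
}

int main(void)
{
	DemoRegion region = {200, 100};
	DemoRect but = {220.0f, 150.0f, 320.0f, 170.0f};  /* window space */

	rect_window_to_region(&but, &region);
	printf("region space: %.0f,%.0f to %.0f,%.0f\n", but.x1, but.y1, but.x2, but.y2);
	return 0;
}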
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
{
	uiBut *but = arg_but;
	uiBlock *block;
	uiBut *bt;
	MenuData *md;
	ListBase lb;
	float aspect;
	int width, height, boxh, columns, rows, startx, starty, x1, y1, xmax, a;

	/* create the block */
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling callback and a remove callback, and like operator
modal handlers it will remember the area and region if it is registered at
the window level (see the sketch after this list).
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
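The handler idea can be sketched independently of the real windowmanager code as follows; DemoHandler, dispatch_event and the other names are invented for the example, and the actual windowmanager handler does considerably more:

#include <stdio.h>

/* illustrative handler: an event callback plus a remove callback,
 * roughly mirroring the UI handler described above */
typedef struct DemoHandler {
	int  (*handle)(void *userdata, int event);  /* returns 1 when the event is consumed */
	void (*remove)(void *userdata);             /* cleanup when handlers are removed */
	void *userdata;
	struct DemoHandler *next;
} DemoHandler;

/* dispatch: window-level handlers run first and may block the event
 * from ever reaching region-level handlers */
static void dispatch_event(DemoHandler *window_handlers, DemoHandler *region_handlers, int event)
{
	DemoHandler *h;

	for (h = window_handlers; h; h = h->next)
		if (h->handle(h->userdata, event))
			return;  /* blocked: a modal UI interaction consumed it */

	for (h = region_handlers; h; h = h->next)
		if (h->handle(h->userdata, event))
			return;
}

static int modal_ui_handle(void *userdata, int event)
{
	(void)userdata;
	printf("modal UI handler consumed event %d\n", event);
	return 1;
}

static int region_handle(void *userdata, int event)
{
	(void)userdata;
	printf("region handler got event %d\n", event);
	return 1;
}

static void noop_remove(void *userdata) { (void)userdata; }

int main(void)
{
	DemoHandler region = {region_handle, noop_remove, NULL, NULL};
	DemoHandler modal = {modal_ui_handle, noop_remove, NULL, NULL};

	dispatch_event(NULL, &region, 1);    /* no modal interaction: the region handler runs */
	dispatch_event(&modal, &region, 2);  /* the modal UI handler blocks the event */
	return 0;
}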
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, including
regions of areas that are not visible, but these handlers are not removed.
Probably there should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP, UI_HELV);
	block->dt = UI_EMBOSSP;
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;
	block->themecol = TH_MENU_ITEM;

	/* compute menu data */
	md = decompose_menu_string(but->str);

	/* columns and row calculation */
	columns = (md->nitems + 20) / 20;
	if (columns < 1)
		columns = 1;
	if (columns > 8)
		columns = (md->nitems + 25) / 25;

	rows = md->nitems / columns;
	if (rows < 1)
		rows = 1;
	while (rows * columns < md->nitems)
		rows++;

	/* prevent scaling up of pupmenu */
	aspect = but->aspect;
	if (aspect < 1.0f)
		aspect = 1.0f;

	/* size and location */
	if (md->title)
		width = 1.5 * aspect * strlen(md->title) + UI_GetStringWidth(block->curfont, md->title, ui_translate_menus());
	else
		width = 0;

	for (a = 0; a < md->nitems; a++) {
		xmax = aspect * UI_GetStringWidth(block->curfont, md->items[a].str, ui_translate_menus());
		if (md->items[a].icon)
			xmax += 20 * aspect;
		if (xmax > width)
			width = xmax;
	}

	width += 10;
	if (width < (but->x2 - but->x1))
		width = (but->x2 - but->x1);
	if (width < 50)
		width = 50;

	boxh = MENU_BUTTON_HEIGHT;

	height = rows * boxh;
	if (md->title)
		height += boxh;

	/* here we go! */
	startx = but->x1;
	starty = but->y1;

	if (md->title) {
		uiBut *bt;
		uiSetCurFont(block, block->font + 1);
		if (md->titleicon) {
			bt = uiDefIconTextBut(block, LABEL, 0, md->titleicon, md->title, startx, (short)(starty + rows * boxh), (short)width, (short)boxh, NULL, 0.0, 0.0, 0, 0, "");
		}
		else {
			bt = uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty + rows * boxh), (short)width, (short)boxh, NULL, 0.0, 0.0, 0, 0, "");
			bt->flag = UI_TEXT_LEFT;
		}
		uiSetCurFont(block, block->font);
	}

	for (a = 0; a < md->nitems; a++) {
		x1 = startx + width * ((int)(md->nitems - a - 1) / rows);
		y1 = starty - boxh * (rows - ((md->nitems - a - 1) % rows)) + (rows * boxh);

		if (strcmp(md->items[md->nitems - a - 1].str, "%l") == 0) {
			bt = uiDefBut(block, SEPR, B_NOP, "", x1, y1, (short)(width - (rows > 1)), (short)(boxh - 1), NULL, 0.0, 0.0, 0, 0, "");
		}
		else if (md->items[md->nitems - a - 1].icon) {
			bt = uiDefIconTextButF(block, BUTM|FLO, B_NOP, md->items[md->nitems - a - 1].icon, md->items[md->nitems - a - 1].str, x1, y1, (short)(width - (rows > 1)), (short)(boxh - 1), &handle->retvalue, (float)md->items[md->nitems - a - 1].retval, 0.0, 0, 0, "");
		}
		else {
			bt = uiDefButF(block, BUTM|FLO, B_NOP, md->items[md->nitems - a - 1].str, x1, y1, (short)(width - (rows > 1)), (short)(boxh - 1), &handle->retvalue, (float)md->items[md->nitems - a - 1].retval, 0.0, 0, 0, "");
		}
	}

	menudata_free(md);

	/* the code above has flipped locations because of the change of preferred order,
	 * that's why we have to switch the list order too, to make arrow keys work */
	lb.first = lb.last = NULL;
	bt = block->buttons.first;
	while (bt) {
		uiBut *next = bt->next;
		BLI_remlink(&block->buttons, bt);
		BLI_addhead(&lb, bt);
		bt = next;
	}
	block->buttons = lb;

	block->direction = UI_TOP;
	uiEndBlock(C, block);
	return block;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended the uiMenu functions, so now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and improve the bounds computation for
popup menus. It used to try to find out in advance what the size
of the menu was, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds (see the sketch
below). Will convert some other functions to use this too.
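That after-the-fact bounds idea can be sketched like this; it is not the real uiPopupBoundsBlock, and all types and names below are illustrative: build the block first, take the bounding box of all buttons, pad it, and shift it so it stays inside the window.

#include <stdio.h>

typedef struct DemoBut { float x1, y1, x2, y2; struct DemoBut *next; } DemoBut;

/* after the block is built, compute its bounds from the buttons and
 * shift the whole popup so it stays inside the window */
static void popup_bounds_after_build(DemoBut *buttons, float pad,
                                     float winx, float winy,
                                     float *offx, float *offy)
{
	DemoBut *bt;
	float xmin = 1e30f, ymin = 1e30f, xmax = -1e30f, ymax = -1e30f;

	for (bt = buttons; bt; bt = bt->next) {
		if (bt->x1 < xmin) xmin = bt->x1;
		if (bt->y1 < ymin) ymin = bt->y1;
		if (bt->x2 > xmax) xmax = bt->x2;
		if (bt->y2 > ymax) ymax = bt->y2;
	}
	xmin -= pad; ymin -= pad; xmax += pad; ymax += pad;

	*offx = 0.0f; *offy = 0.0f;
	if (xmin < 0.0f) *offx = -xmin;
	if (ymin < 0.0f) *offy = -ymin;
	if (xmax + *offx > winx) *offx = winx - xmax;
	if (ymax + *offy > winy) *offy = winy - ymax;
}

int main(void)
{
	DemoBut b2 = {180.0f, -10.0f, 260.0f, 10.0f, NULL};
	DemoBut b1 = {180.0f, 10.0f, 260.0f, 30.0f, &b2};
	float offx, offy;

	popup_bounds_after_build(&b1, 3.0f, 240.0f, 200.0f, &offx, &offy);
	printf("shift popup by %.0f,%.0f to stay in the window\n", offx, offy);
	return 0;
}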
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_ICONROW(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
{
	uiBut *but = arg_but;
	uiBlock *block;
	int a;
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP, UI_HELV);
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;
	block->themecol = TH_MENU_ITEM;

	for (a = (int)but->min; a <= (int)but->max; a++) {
		uiDefIconButF(block, BUTM|FLO, B_NOP, but->icon + (a - but->min), 0, (short)(18 * a), (short)(but->x2 - but->x1 - 4), 18, &handle->retvalue, (float)a, 0.0, 0, 0, "");
	}

	block->direction = UI_TOP;
	uiEndBlock(C, block);
	return block;
}
uiBlock *ui_block_func_ICONTEXTROW(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
{
	uiBut *but = arg_but;
	uiBlock *block;
	MenuData *md;
	int width, xmax, ypos, a;
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP, UI_HELV);
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;
	block->themecol = TH_MENU_ITEM;

	md = decompose_menu_string(but->str);

	/* size and location */
	/* expand menu width to fit labels */
	if (md->title)
		width = 2 * strlen(md->title) + UI_GetStringWidth(block->curfont, md->title, ui_translate_menus());
	else
		width = 0;

	for (a = 0; a < md->nitems; a++) {
		xmax = UI_GetStringWidth(block->curfont, md->items[a].str, ui_translate_menus());
		if (xmax > width)
			width = xmax;
	}

	width += 30;
	if (width < 50)
		width = 50;

	ypos = 1;

	/* loop through the menu options and draw them out with icons & text labels */
	for (a = 0; a < md->nitems; a++) {
		/* add a space if there's a separator (%l) */
		if (strcmp(md->items[a].str, "%l") == 0) {
			ypos += 3;
		}
		else {
			uiDefIconTextButF(block, BUTM|FLO, B_NOP, (short)((but->icon) + (md->items[a].retval - but->min)), md->items[a].str, 0, ypos, (short)width, 19, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
			ypos += 20;
		}
	}

	if (md->title) {
		uiBut *bt;
		uiSetCurFont(block, block->font + 1);
		bt = uiDefBut(block, LABEL, 0, md->title, 0, ypos, (short)width, 19, NULL, 0.0, 0.0, 0, 0, "");
		uiSetCurFont(block, block->font);
		bt->flag = UI_TEXT_LEFT;
	}

	menudata_free(md);

	block->direction = UI_TOP;
	uiBoundsBlock(block, 3);
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now takes a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which will
block events (a minimal sketch of such a handler pair follows at the end of
these notes).
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has event handling and remove callbacks, and like operator modal handlers it
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
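To make the handler wiring concrete, here is a minimal sketch of a UI handler
with an event callback and a remove callback, registered on a region's handler
list. Only UI_add_region_handlers is named in the notes above; the callback
bodies, the helpers they call and the exact registration signature are
assumptions for illustration.

/* hypothetical sketch of a region level UI handler */
static int ui_region_handler(bContext *C, wmEvent *event, void *userdata)
{
    /* dispatch the event to the buttons of the blocks in this region and
     * report back whether it was consumed, so other handlers can be skipped */
    ARegion *ar= CTX_wm_region(C);

    return ui_handle_event_for_region(C, ar, event);    /* made-up helper */
}

static void ui_region_handler_remove(bContext *C, void *userdata)
{
    /* called when all handlers of the region are removed: cancel timers,
     * close tooltips and free any active button state */
    ARegion *ar= CTX_wm_region(C);

    ui_cancel_active_button(C, ar);    /* made-up helper */
}

void UI_add_region_handlers(ListBase *handlers)
{
    /* registration call assumed for illustration */
    WM_event_add_ui_handler(NULL, handlers, ui_region_handler, ui_region_handler_remove, NULL);
}

The window level handler added for modal interactions would use the same pair
of callbacks, only registered on the window's handler list so it sees events
before the region handlers do.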
2008-12-10 04:36:33 +00:00
    uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
    return block;
}

static void ui_warp_pointer(short x, short y)
{
    /* XXX 2.50 which function to use for this? */
#if 0
    /* OSX has very poor mousewarp support, it sends events;
       this causes a menu being pressed immediately ... */
#ifndef __APPLE__
    warp_pointer(x, y);
#endif
#endif
}
/********************* Color Button ****************/

/* picker sizes S hsize, F full size, D spacer, B button/palette height */
#define SPICK   110.0
#define FPICK   180.0
#define DPICK   6.0
#define BPICK   24.0

#define UI_PALETTE_TOT 16
/* note; in tot+1 the old color is stored */
static float palette[UI_PALETTE_TOT+1][3]= {
    {0.93, 0.83, 0.81}, {0.88, 0.89, 0.73}, {0.69, 0.81, 0.57}, {0.51, 0.76, 0.64},
    {0.37, 0.56, 0.61}, {0.33, 0.29, 0.55}, {0.46, 0.21, 0.51}, {0.40, 0.12, 0.18},
    {1.0, 1.0, 1.0}, {0.85, 0.85, 0.85}, {0.7, 0.7, 0.7}, {0.56, 0.56, 0.56},
    {0.42, 0.42, 0.42}, {0.28, 0.28, 0.28}, {0.14, 0.14, 0.14}, {0.0, 0.0, 0.0}
};
/* for picker, while editing hsv */
void ui_set_but_hsv(uiBut *but)
{
    float col[3];

    hsv_to_rgb(but->hsv[0], but->hsv[1], but->hsv[2], col, col+1, col+2);
    ui_set_but_vectorf(but, col);
}
static void update_picker_hex(uiBlock *block, float *rgb)
{
    uiBut *bt;
    char col[16];

    sprintf(col, "%02X%02X%02X", (unsigned int)(rgb[0]*255.0), (unsigned int)(rgb[1]*255.0), (unsigned int)(rgb[2]*255.0));

    // this updates button strings, is hackish... but button pointers are on stack of caller function
    for(bt= block->buttons.first; bt; bt= bt->next) {
        if(strcmp(bt->str, "Hex: ")==0) {
            strcpy(bt->poin, col);
            ui_check_but(bt);
            break;
        }
    }
}
void ui_update_block_buts_hsv(uiBlock *block, float *hsv)
{
    uiBut *bt;
    float r, g, b;
    float rgb[3];

    // this updates button strings, is hackish... but button pointers are on stack of caller function
    hsv_to_rgb(hsv[0], hsv[1], hsv[2], &r, &g, &b);
    rgb[0]= r; rgb[1]= g; rgb[2]= b;
    update_picker_hex(block, rgb);

    for(bt= block->buttons.first; bt; bt= bt->next) {
        if(bt->type==HSVCUBE) {
            VECCOPY(bt->hsv, hsv);
            ui_set_but_hsv(bt);
        }
        else if(bt->str[1]==' ') {
            if(bt->str[0]=='R') {
                ui_set_but_val(bt, r);
            }
            else if(bt->str[0]=='G') {
                ui_set_but_val(bt, g);
            }
            else if(bt->str[0]=='B') {
                ui_set_but_val(bt, b);
            }
            else if(bt->str[0]=='H') {
                ui_set_but_val(bt, hsv[0]);
            }
            else if(bt->str[0]=='S') {
                ui_set_but_val(bt, hsv[1]);
            }
            else if(bt->str[0]=='V') {
                ui_set_but_val(bt, hsv[2]);
            }
        }
    }
}
static void ui_update_block_buts_hex(uiBlock *block, char *hexcol)
{
    uiBut *bt;
    float r=0, g=0, b=0;
    float h, s, v;

    // this updates button strings, is hackish... but button pointers are on stack of caller function
    hex_to_rgb(hexcol, &r, &g, &b);
    rgb_to_hsv(r, g, b, &h, &s, &v);

    for(bt= block->buttons.first; bt; bt= bt->next) {
        if(bt->type==HSVCUBE) {
            bt->hsv[0]= h;
            bt->hsv[1]= s;
            bt->hsv[2]= v;
            ui_set_but_hsv(bt);
        }
        else if(bt->str[1]==' ') {
            if(bt->str[0]=='R') {
                ui_set_but_val(bt, r);
            }
            else if(bt->str[0]=='G') {
                ui_set_but_val(bt, g);
            }
            else if(bt->str[0]=='B') {
                ui_set_but_val(bt, b);
            }
            else if(bt->str[0]=='H') {
                ui_set_but_val(bt, h);
            }
            else if(bt->str[0]=='S') {
                ui_set_but_val(bt, s);
            }
            else if(bt->str[0]=='V') {
                ui_set_but_val(bt, v);
            }
        }
    }
}

/* bt1 is palette but, col1 is original color */
/* callback to copy from/to palette */
2008-12-10 19:22:10 +00:00
static void do_palette_cb(bContext *C, void *bt1, void *col1)
2008-11-11 18:31:32 +00:00
{
    uiBut *but1= (uiBut *)bt1;
    float *col= (float *)col1;
    float *fp, hsv[3];

    fp= (float *)but1->poin;

    /* XXX 2.50 bad access, how to solve?
     *
    if((get_qual() & LR_CTRLKEY)) {
        VECCOPY(fp, col);
    }
    else */ {
        VECCOPY(col, fp);
    }

    rgb_to_hsv(col[0], col[1], col[2], hsv, hsv+1, hsv+2);
    ui_update_block_buts_hsv(but1->block, hsv);
    update_picker_hex(but1->block, col);
}

/* bt1 is num but, hsv1 is pointer to original color in hsv space */
/* callback to handle changes in num-buts in picker */
2008-12-10 19:22:10 +00:00
static void do_palette1_cb(bContext *C, void *bt1, void *hsv1)
2008-11-11 18:31:32 +00:00
{
    uiBut *but1= (uiBut *)bt1;
    float *hsv= (float *)hsv1;
    float *fp= NULL;

    if(but1->str[1]==' ') {
        if(but1->str[0]=='R') fp= (float *)but1->poin;
        else if(but1->str[0]=='G') fp= ((float *)but1->poin)-1;
        else if(but1->str[0]=='B') fp= ((float *)but1->poin)-2;
    }

    if(fp) {
        rgb_to_hsv(fp[0], fp[1], fp[2], hsv, hsv+1, hsv+2);
    }

    ui_update_block_buts_hsv(but1->block, hsv);
}

/* bt1 is num but, col1 is pointer to original color */
/* callback to handle changes in num-buts in picker */
2008-12-10 19:22:10 +00:00
static void do_palette2_cb(bContext *C, void *bt1, void *col1)
2008-11-11 18:31:32 +00:00
{
    uiBut *but1= (uiBut *)bt1;
    float *rgb= (float *)col1;
    float *fp= NULL;

    if(but1->str[1]==' ') {
        if(but1->str[0]=='H') fp= (float *)but1->poin;
        else if(but1->str[0]=='S') fp= ((float *)but1->poin)-1;
        else if(but1->str[0]=='V') fp= ((float *)but1->poin)-2;
    }

    if(fp) {
        hsv_to_rgb(fp[0], fp[1], fp[2], rgb, rgb+1, rgb+2);
    }

    ui_update_block_buts_hsv(but1->block, fp);
}
2008-12-10 19:22:10 +00:00
static void do_palette_hex_cb(bContext *C, void *bt1, void *hexcl)
2008-11-11 18:31:32 +00:00
{
    uiBut *but1= (uiBut *)bt1;
    char *hexcol= (char *)hexcl;

    ui_update_block_buts_hex(but1->block, hexcol);
}

/* used for both 3d view and image window */
2008-12-10 19:22:10 +00:00
static void do_palette_sample_cb(bContext *C, void *bt1, void *col1)   /* frontbuf */
2008-11-11 18:31:32 +00:00
{
    /* XXX 2.50 this should become an operator? */
#if 0
    uiBut *but1= (uiBut *)bt1;
    uiBut *but;
    float tempcol[4];
    int x=0, y=0;
    short mval[2];
    float hsv[3];
    short capturing;
    int oldcursor;
    Window *win;
    unsigned short dev;

    oldcursor= get_cursor();
    win= winlay_get_active_window();

    while(get_mbut() & L_MOUSE) UI_wait_for_statechange();

    SetBlenderCursor(BC_EYEDROPPER_CURSOR);

    /* loop and wait for a mouse click */
    capturing= TRUE;
    while(capturing) {
        char ascii;
        short val;

        dev= extern_qread_ext(&val, &ascii);

        if(dev==INPUTCHANGE) break;
        if(get_mbut() & R_MOUSE) break;
        else if(get_mbut() & L_MOUSE) {
            uiGetMouse(mywinget(), mval);
            x= mval[0]; y= mval[1];
            capturing= FALSE;
            break;
        }
        else if(dev==ESCKEY) break;
    }
    window_set_cursor(win, oldcursor);

    if(capturing) return;

    if(x<0 || y<0) return;

    /* if we've got a click, use OpenGL to sample the color under the mouse pointer */
    glReadBuffer(GL_FRONT);
    glReadPixels(x, y, 1, 1, GL_RGBA, GL_FLOAT, tempcol);
    glReadBuffer(GL_BACK);

    /* and send that color back to the picker */
    rgb_to_hsv(tempcol[0], tempcol[1], tempcol[2], hsv, hsv+1, hsv+2);
    ui_update_block_buts_hsv(but1->block, hsv);
    update_picker_hex(but1->block, tempcol);

    for(but= but1->block->buttons.first; but; but= but->next) {
        ui_check_but(but);
        ui_draw_but(but);
    }

    but= but1->block->buttons.first;
    ui_block_flush_back(but->block);
#endif
}

/* color picker, Gimp version. mode: 'f' = floating panel, 'p' = popup */
/* col = read/write to, hsv/old/hexcol = memory for temporary use */
void uiBlockPickerButtons(uiBlock *block, float *col, float *hsv, float *old, char *hexcol, char mode, short retval)
{
    uiBut *bt;
    float h, offs;
    int a;

    VECCOPY(old, col);  // old color stored there, for palette_cb to work

    // the cube intersection
    bt= uiDefButF(block, HSVCUBE, retval, "", 0, DPICK+BPICK, FPICK, FPICK, col, 0.0, 0.0, 2, 0, "");
    uiButSetFlag(bt, UI_NO_HILITE);

    bt= uiDefButF(block, HSVCUBE, retval, "", 0, 0, FPICK, BPICK, col, 0.0, 0.0, 3, 0, "");
    uiButSetFlag(bt, UI_NO_HILITE);

    // palette
    uiBlockSetEmboss(block, UI_EMBOSSP);

    bt= uiDefButF(block, COL, retval, "", FPICK+DPICK, 0, BPICK, BPICK, old, 0.0, 0.0, -1, 0, "Old color, click to restore");
    uiButSetFunc(bt, do_palette_cb, bt, col);
    uiDefButF(block, COL, retval, "", FPICK+DPICK, BPICK+DPICK, BPICK, 60-BPICK-DPICK, col, 0.0, 0.0, -1, 0, "Active color");

    h= (DPICK+BPICK+FPICK-64)/(UI_PALETTE_TOT/2.0);
    uiBlockBeginAlign(block);
    for(a= -1+UI_PALETTE_TOT/2; a>=0; a--) {
        bt= uiDefButF(block, COL, retval, "", FPICK+DPICK, 65.0+(float)a*h, BPICK/2, h, palette[a+UI_PALETTE_TOT/2], 0.0, 0.0, -1, 0, "Click to choose, hold CTRL to store in palette");
        uiButSetFunc(bt, do_palette_cb, bt, col);
        bt= uiDefButF(block, COL, retval, "", FPICK+DPICK+BPICK/2, 65.0+(float)a*h, BPICK/2, h, palette[a], 0.0, 0.0, -1, 0, "Click to choose, hold CTRL to store in palette");
        uiButSetFunc(bt, do_palette_cb, bt, col);
    }
    uiBlockEndAlign(block);
    uiBlockSetEmboss(block, UI_EMBOSS);

    // buttons
    rgb_to_hsv(col[0], col[1], col[2], hsv, hsv+1, hsv+2);
    sprintf(hexcol, "%02X%02X%02X", (unsigned int)(col[0]*255.0), (unsigned int)(col[1]*255.0), (unsigned int)(col[2]*255.0));

    offs= FPICK+2*DPICK+BPICK;

    /* note; made this a TOG now, with NULL pointer. Is because BUT now gets handled with an afterfunc */
    bt= uiDefIconTextBut(block, TOG, UI_RETURN_OK, ICON_EYEDROPPER, "Sample ", offs+55, 170, 85, 20, NULL, 0, 0, 0, 0, "Sample the color underneath the following mouse click (ESC or RMB to cancel)");
    uiButSetFunc(bt, do_palette_sample_cb, bt, col);
    uiButSetFlag(bt, UI_TEXT_LEFT);

    bt= uiDefBut(block, TEX, retval, "Hex: ", offs, 140, 140, 20, hexcol, 0, 8, 0, 0, "Hex triplet for color (#RRGGBB)");
    uiButSetFunc(bt, do_palette_hex_cb, bt, hexcol);

    uiBlockBeginAlign(block);
    bt= uiDefButF(block, NUMSLI, retval, "R ", offs, 110, 140, 20, col, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette1_cb, bt, hsv);
    bt= uiDefButF(block, NUMSLI, retval, "G ", offs, 90, 140, 20, col+1, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette1_cb, bt, hsv);
    bt= uiDefButF(block, NUMSLI, retval, "B ", offs, 70, 140, 20, col+2, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette1_cb, bt, hsv);

    uiBlockBeginAlign(block);
    bt= uiDefButF(block, NUMSLI, retval, "H ", offs, 40, 140, 20, hsv, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette2_cb, bt, col);
    bt= uiDefButF(block, NUMSLI, retval, "S ", offs, 20, 140, 20, hsv+1, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette2_cb, bt, col);
    bt= uiDefButF(block, NUMSLI, retval, "V ", offs, 0, 140, 20, hsv+2, 0.0, 1.0, 10, 3, "");
    uiButSetFunc(bt, do_palette2_cb, bt, col);
    uiBlockEndAlign(block);
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers too. This is
done with a uiDefMenuBut, which takes a pointer to a uiMenuCreateFunc that
will then call the uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are specific to
making popup menus. Will convert the other confirmation popup menu functions
to use this too so we can remove some code (a rough usage sketch follows
after this list).
* Extended the uiMenu functions, there are now also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which simplifies the
menu code further here. Did not remove the uiDefMenu functions as they are
still used in sequencer/view3d in some places (will fix).
* Also tried to simplify and fix bounds computation a bit for popup menus. It
used to find out in advance what the size of the menu was, but this is
difficult with keymap strings in there; now uiPopupBoundsBlock can figure this
out afterwards and ensure the popup is within the window bounds. Will convert
some other functions to use this too.
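A rough sketch of the popup flow described above. Only the function names
uiPupMenuBegin, uiPupMenuEnd, uiDefMenuBut and uiMenuCreateFunc come from the
notes; the argument lists, the uiMenuItem type, the item call and the operator
name are assumptions for illustration and may not match the branch exactly.

/* hypothetical usage sketch, signatures assumed */
static void confirm_delete_popup(bContext *C)
{
    uiMenuItem *head= uiPupMenuBegin("Erase selected object(s)?", 0);   /* assumed signature */

    uiMenuItemO(head, 0, "OBJECT_OT_delete");   /* assumed item call */

    /* computes the final bounds afterwards (uiPopupBoundsBlock) and shows the popup */
    uiPupMenuEnd(C, head);
}

For a header, a uiMenuCreateFunc style callback filling in the same items would
instead be passed to uiDefMenuBut, so pulldowns and popups can share one menu
definition.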
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_COL(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
2008-11-11 18:31:32 +00:00
{
    uiBut *but= arg_but;
    uiBlock *block;
    static float hsvcol[3], oldcol[3];
    static char hexcol[128];
2008-12-10 04:36:33 +00:00
    block= uiBeginBlock(C, handle->region, "colorpicker", UI_EMBOSS, UI_HELV);
2008-11-11 18:31:32 +00:00
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_KEEP_OPEN;
	block->themecol = TH_BUT_NUM;

	VECCOPY(handle->retvec, but->editvec);
	uiBlockPickerButtons(block, handle->retvec, hsvcol, oldcol, hexcol, 'p', 0);

	/* and lets go */
	block->direction = UI_TOP;
	uiBoundsBlock(block, 3);

	return block;
}
/* ******************** PUPmenu ****************** */
static int pupmenu_set = 0;
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions
(see the sketch after this list).
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the
uiDefMenu functions as they are used in sequencer/view3d in
some places now (will fix).
* Also tried to simplify and fix bounds computation for popup
menus. It used to try to work out the size of the menu in
advance, but this is difficult with keymap strings included;
now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup stays within the window bounds. Will convert
some other functions to use this too.
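As a rough illustration of the header menu path described in the notes above, a
header could declare a pulldown with uiDefMenuBut and fill it from a
uiMenuCreateFunc. The sketch below is hypothetical: the item functions and the
exact argument lists are assumptions for illustration, not taken from the code
in this file.

/* sketch only: names and argument lists below are assumed, not from this file */

/* a uiMenuCreateFunc: called when the pulldown opens, it only issues
 * uiMenuItem... and uiMenuSeparator calls, just like a popup menu would */
static void view_menu(bContext *C, uiMenuItem *head, void *arg_unused)
{
	uiMenuItemO(head, 0, "IMAGE_OT_view_zoom_in");
	uiMenuItemO(head, 0, "IMAGE_OT_view_zoom_out");
	uiMenuSeparator(head);
	uiMenuItemO(head, 0, "IMAGE_OT_view_all");
}

/* in the header's button block the pulldown is just another button;
 * uiDefMenuBut stores the create callback and opens the menu on click */
static void header_menus(const bContext *C, uiBlock *block, int xco, int yco)
{
	uiDefMenuBut(block, view_menu, NULL, "View", xco, yco, 80, 20, "");
}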
2009-01-30 12:18:08 +00:00
void uiPupMenuSetActive(int val)
2008-11-11 18:31:32 +00:00
{
	pupmenu_set = val;
}

/* value == -1: read, otherwise set */
static int pupmenu_memory(char *str, int value)
{
	static char mem[256], first = 1;
	int val = 0, nr = 0;

	if(first) {
		memset(mem, 0, 256);
		first = 0;
	}

	/* sum of the menu string's characters is used as a simple hash index */
	while(str[nr]) {
		val += str[nr];
		nr++;
	}

	if(value >= 0) mem[val & 255] = value;	/* store last chosen item */
	else return mem[val & 255];				/* value == -1: read it back */

	return 0;
}
#define PUP_LABELH 6

typedef struct uiPupMenuInfo {
	char *instr;
	int mx, my;
	int startx, starty;
	int maxrow;
} uiPupMenuInfo;
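For orientation, a minimal sketch of how this info struct might be filled in
before being handed to ui_block_func_PUPMENU below. The creation helper named
here (popup_menu_create) is hypothetical and only stands in for whatever code
actually builds the popup region from this block function.

/* sketch only: popup_menu_create is a hypothetical stand-in */
static void pupmenu_example(bContext *C, int mx, int my)
{
	/* kept static so it outlives this call while the popup is open */
	static uiPupMenuInfo info;

	/* pupmenu strings encode the title and items, "%l" marking a separator */
	info.instr = "Add%t|Mesh%x1|Curve%x2|%l|Empty%x3";
	info.mx = mx;		/* mouse position, used to place the block */
	info.my = my;
	info.maxrow = 0;

	/* ui_block_func_PUPMENU receives &info as its arg_info argument */
	popup_menu_create(C, ui_block_func_PUPMENU, &info);
}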
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_PUPMENU(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
2008-11-11 18:31:32 +00:00
{
	uiBlock *block;
	uiPupMenuInfo *info;
	int columns, rows, mousemove[2] = {0, 0}, mousewarp = 0;
	int width, height, xmax, ymax, maxrow;
	int a, startx, starty, endx, endy, x1, y1;
	int lastselected;
	MenuData *md;

	info = arg_info;
	maxrow = info->maxrow;
	height = 0;

	/* block stuff first, need to know the font */
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap (see the
sketch after these notes).
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and remove callback, and like operator modal handlers
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
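A rough sketch of how a region could hook up these handlers when it is
initialized, and how its blocks are built with the new context argument. The
surrounding init/draw functions and their callers are assumptions for
illustration; only UI_add_region_handlers and uiBeginBlock/uiEndBlock are
named in the notes above.

/* sketch only: the region init/draw functions here are assumed */
static void my_region_init(wmWindowManager *wm, ARegion *ar)
{
	/* region level UI handlers replace the old UI keymap; buttons in this
	 * region get their events through the UI handler from now on */
	UI_add_region_handlers(&ar->handlers);
}

static void my_region_draw(const bContext *C, ARegion *ar)
{
	uiBlock *block;

	/* blocks take the context now, so timers and tooltips can be
	 * cancelled properly when the region is removed */
	block = uiBeginBlock(C, ar, "my_block", UI_EMBOSS, UI_HELV);
	/* ... uiDefBut calls ... */
	uiEndBlock(C, block);
}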
2008-12-10 04:36:33 +00:00
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP, UI_HELV);
2008-11-11 18:31:32 +00:00
	uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_RET_1|UI_BLOCK_NUMSELECT);
	block->themecol = TH_MENU_ITEM;
2009-01-10 14:03:00 +00:00
	block->direction = UI_DOWN;
2008-11-11 18:31:32 +00:00
	md = decompose_menu_string(info->instr);
	rows = md->nitems;
	columns = 1;

	/* size and location, title slightly bigger for bold */
	if(md->title) {
		width = 2*strlen(md->title) + UI_GetStringWidth(uiBlockGetCurFont(block), md->title, ui_translate_buttons());
		width /= columns;
	}
	else width = 0;

	for(a = 0; a < md->nitems; a++) {
		xmax = UI_GetStringWidth(uiBlockGetCurFont(block), md->items[a].str, ui_translate_buttons());
		if(xmax > width) width = xmax;

		if(strcmp(md->items[a].str, "%l") == 0) height += PUP_LABELH;
		else height += MENU_BUTTON_HEIGHT;
	}

	width += 10;
	if(width < 50) width = 50;
2008-12-18 02:56:48 +00:00
	wm_window_get_size(CTX_wm_window(C), &xmax, &ymax);
2008-11-11 18:31:32 +00:00
	/* set first item */
	lastselected = 0;
	if(pupmenu_set) {
		lastselected = pupmenu_set - 1;
		pupmenu_set = 0;
	}
	else if(md->nitems > 1) {
		lastselected = pupmenu_memory(info->instr, -1);
	}

	startx = info->mx - (0.8*(width));
	starty = info->my - height + MENU_BUTTON_HEIGHT/2;

	if(lastselected >= 0 && lastselected < md->nitems) {
		for(a = 0; a < md->nitems; a++) {
			if(a == lastselected) break;
			if(strcmp(md->items[a].str, "%l") == 0) starty += PUP_LABELH;
			else starty += MENU_BUTTON_HEIGHT;
		}
		//starty= info->my-height+MENU_BUTTON_HEIGHT/2+lastselected*MENU_BUTTON_HEIGHT;
	}

	if(startx < 10) {
		startx = 10;
	}
	if(starty < 10) {
		mousemove[1] = 10 - starty;
		starty = 10;
	}

	endx = startx + width*columns;
	endy = starty + height;

	if(endx > xmax) {
		endx = xmax - 10;
		startx = endx - width*columns;
	}
	if(endy > ymax - 20) {
		mousemove[1] = ymax - endy - 20;
		endy = ymax - 20;
		starty = endy - height;
	}

	if(mousemove[0] || mousemove[1]) {
		ui_warp_pointer(info->mx + mousemove[0], info->my + mousemove[1]);
		mousemove[0] = info->mx;
		mousemove[1] = info->my;
		mousewarp = 1;
	}

	/* here we go! */
	if(md->title) {
		uiBut *bt;
		char titlestr[256];

		uiSetCurFont(block, UI_HELVB);

		if(md->titleicon) {
			width += 20;
			sprintf(titlestr, "%s", md->title);
			uiDefIconTextBut(block, LABEL, 0, md->titleicon, titlestr, startx, (short)(starty+height), width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
		}
		else {
			bt = uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty+height), columns*width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
			bt->flag = UI_TEXT_LEFT;
		}

		uiSetCurFont(block, UI_HELV);
2009-01-04 00:05:40 +00:00
//uiDefBut(block, SEPR, 0, "", startx, (short)(starty+height)-MENU_SEPR_HEIGHT, width, MENU_SEPR_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
2008-11-11 18:31:32 +00:00
}
	x1 = startx + width*((int)a/rows);
2009-01-04 00:05:40 +00:00
	y1 = starty + height - MENU_BUTTON_HEIGHT; // - MENU_SEPR_HEIGHT;
2008-11-11 18:31:32 +00:00
	for(a = 0; a < md->nitems; a++) {
		char *name = md->items[a].str;
		int icon = md->items[a].icon;

		if(strcmp(name, "%l") == 0) {
			uiDefBut(block, SEPR, B_NOP, "", x1, y1, width, PUP_LABELH, NULL, 0, 0.0, 0, 0, "");
			y1 -= PUP_LABELH;
		}
		else if(icon) {
			uiDefIconButF(block, BUTM, B_NOP, icon, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
			y1 -= MENU_BUTTON_HEIGHT;
		}
		else {
			uiDefButF(block, BUTM, B_NOP, name, x1, y1, width, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
			y1 -= MENU_BUTTON_HEIGHT;
		}
	}

	uiBoundsBlock(block, 1);
2008-12-10 04:36:33 +00:00
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
	menudata_free(md);

	/* XXX 2.5 need to store last selected */
#if 0
	/* calculate last selected */
	if(event & ui_return_ok) {
		lastselected = 0;
		for(a = 0; a < md->nitems; a++) {
			if(val == md->items[a].retval) lastselected = a;
		}
		pupmenu_memory(info->instr, lastselected);
	}
#endif

	/* XXX 2.5 need to warp back */
#if 0
	if(mousemove[1] && (event & ui_return_out) == 0)
		ui_warp_pointer(mousemove[0], mousemove[1]);
	return val;
#endif

	return block;
}
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_PUPMENUCOL(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
2008-11-11 18:31:32 +00:00
{
	uiBlock *block;
	uiPupMenuInfo *info;
	int columns, rows, mousemove[2] = {0, 0}, mousewarp;
	int width, height, xmax, ymax, maxrow;
	int a, startx, starty, endx, endy, x1, y1;
	float fvalue;
	MenuData *md;

	info = arg_info;
	maxrow = info->maxrow;
	height = 0;

	/* block stuff first, need to know the font */
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler, which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap (a usage sketch
follows after this function).
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and a remove callback, and like operator modal
handlers it will remember the area and region if it is registered at the
window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level, so we need it there too.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons seem to move by a pixel or so sometimes when
interacting with them.
* Not due to this commit it seems, but UI and keymap handlers are leaking: it
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are never removed. Probably
there should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP, UI_HELV);
2008-11-11 18:31:32 +00:00
	uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_RET_1|UI_BLOCK_NUMSELECT);
	block->themecol = TH_MENU_ITEM;
2009-01-10 14:03:00 +00:00
	block->direction = UI_DOWN;
2008-11-11 18:31:32 +00:00
	md = decompose_menu_string(info->instr);

	/* columns and row calculation */
	columns = (md->nitems + maxrow)/maxrow;
	if(columns < 1) columns = 1;

	if(columns > 8) {
		maxrow += 5;
		columns = (md->nitems + maxrow)/maxrow;
	}

	rows = (int)md->nitems/columns;
	if(rows < 1) rows = 1;

	while(rows*columns < (md->nitems + columns)) rows++;

	/* size and location, title slightly bigger for bold */
	if(md->title) {
		width = 2*strlen(md->title) + UI_GetStringWidth(uiBlockGetCurFont(block), md->title, ui_translate_buttons());
		width /= columns;
	}
	else width = 0;

	for(a = 0; a < md->nitems; a++) {
		xmax = UI_GetStringWidth(uiBlockGetCurFont(block), md->items[a].str, ui_translate_buttons());
		if(xmax > width) width = xmax;
	}

	width += 10;
	if(width < 50) width = 50;

	height = rows*MENU_BUTTON_HEIGHT;
	if(md->title) height += MENU_BUTTON_HEIGHT;
2008-12-18 02:56:48 +00:00
	wm_window_get_size(CTX_wm_window(C), &xmax, &ymax);
2008-11-11 18:31:32 +00:00
	/* find active item */
	fvalue = handle->retvalue;
	for(a = 0; a < md->nitems; a++) {
		if(md->items[a].retval == (int)fvalue) break;
	}

	/* no active item? */
	if(a == md->nitems) {
		if(md->title) a = -1;
		else a = 0;
	}

	if(a > 0)
		startx = info->mx - width/2 - ((int)(a)/rows)*width;
	else
		startx = info->mx - width/2;

	starty = info->my - height + MENU_BUTTON_HEIGHT/2 + ((a)%rows)*MENU_BUTTON_HEIGHT;

	if(md->title) starty += MENU_BUTTON_HEIGHT;

	if(startx < 10) {
		mousemove[0] = 10 - startx;
		startx = 10;
	}
	if(starty < 10) {
		mousemove[1] = 10 - starty;
		starty = 10;
	}

	endx = startx + width*columns;
	endy = starty + height;

	if(endx > xmax) {
		mousemove[0] = xmax - endx - 10;
		endx = xmax - 10;
		startx = endx - width*columns;
	}
	if(endy > ymax) {
		mousemove[1] = ymax - endy - 10;
		endy = ymax - 10;
		starty = endy - height;
	}

	if(mousemove[0] || mousemove[1]) {
		ui_warp_pointer(info->mx + mousemove[0], info->my + mousemove[1]);
		mousemove[0] = info->mx;
		mousemove[1] = info->my;
		mousewarp = 1;
	}

	/* here we go! */
	if(md->title) {
		uiBut *bt;
		uiSetCurFont(block, UI_HELVB);

		if(md->titleicon) {
		}
		else {
			bt = uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty + rows*MENU_BUTTON_HEIGHT), columns*width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
			bt->flag = UI_TEXT_LEFT;
		}

		uiSetCurFont(block, UI_HELV);
	}

	for(a = 0; a < md->nitems; a++) {
		char *name = md->items[a].str;
		int icon = md->items[a].icon;

		x1 = startx + width*((int)a/rows);
		y1 = starty - MENU_BUTTON_HEIGHT*(a%rows) + (rows - 1)*MENU_BUTTON_HEIGHT;

		if(strcmp(name, "%l") == 0) {
			uiDefBut(block, SEPR, B_NOP, "", x1, y1, width, PUP_LABELH, NULL, 0, 0.0, 0, 0, "");
			y1 -= PUP_LABELH;
		}
		else if(icon) {
			uiDefIconButF(block, BUTM, B_NOP, icon, x1, y1, width + 16, MENU_BUTTON_HEIGHT - 1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
			y1 -= MENU_BUTTON_HEIGHT;
		}
		else {
			uiDefButF(block, BUTM, B_NOP, name, x1, y1, width, MENU_BUTTON_HEIGHT - 1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
			y1 -= MENU_BUTTON_HEIGHT;
		}
	}

	uiBoundsBlock(block, 1);
2008-12-10 04:36:33 +00:00
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
	menudata_free(md);

	/* XXX 2.5 need to warp back */
#if 0
	if((event & UI_RETURN_OUT) == 0)
		ui_warp_pointer(mousemove[0], mousemove[1]);
#endif

	return block;
}
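The UI handler setup described in the "don't use operators" note above would be
used from a region's init and draw callbacks roughly as sketched here. Only
UI_add_region_handlers and the context-taking uiBeginBlock/uiEndBlock come from
that note and from the code in this file; the callback signatures and the
"header buttons" block name are assumptions for illustration.

/* sketch only: callback signatures are assumed */
static void header_area_init(wmWindowManager *wm, ARegion *ar)
{
	/* region level UI handlers replace the old UI keymap */
	UI_add_region_handlers(&ar->handlers);
}

static void header_area_draw(bContext *C, ARegion *ar)
{
	/* blocks are begun and ended with the context, so timers and
	 * tooltips can be cancelled when the region goes away */
	uiBlock *block = uiBeginBlock(C, ar, "header buttons", UI_EMBOSS, UI_HELV);

	/* ... uiDefBut calls for the header buttons ... */

	uiEndBlock(C, block);
}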
2009-01-30 12:18:08 +00:00
/************************** Menu Definitions ***************************/
2009-01-25 20:22:05 +00:00
/* prototype */
2009-01-30 12:18:08 +00:00
static uiBlock *ui_block_func_MENU_ITEM(bContext *C, uiPopupBlockHandle *handle, void *arg_info);
2009-01-25 20:22:05 +00:00
#define MAX_MENU_STR 64
/* type, internal */
2009-01-30 12:18:08 +00:00
#define MENU_ITEM_TITLE 0
#define MENU_ITEM_ITEM 1
#define MENU_ITEM_OPNAME 2
#define MENU_ITEM_OPNAME_BOOL 3
#define MENU_ITEM_OPNAME_ENUM 4
2009-02-04 11:52:16 +00:00
#define MENU_ITEM_OPNAME_INT 5
#define MENU_ITEM_OPNAME_FLOAT 6
#define MENU_ITEM_RNA_BOOL 7
#define MENU_ITEM_RNA_ENUM 8
#define MENU_ITEM_LEVEL 9
#define MENU_ITEM_LEVEL_OPNAME_ENUM 10
#define MENU_ITEM_LEVEL_RNA_ENUM 11
#define MENU_ITEM_SEPARATOR 12
2009-01-25 20:22:05 +00:00
struct uiMenuItem {
	struct uiMenuItem *next, *prev;

	int type;
	int icon;
	char name[MAX_MENU_STR];
	char *opname; /* static string */
	char *propname; /* static string */
2009-02-04 11:52:16 +00:00
	int retval, enumval, boolval, intval;
2009-01-30 12:18:08 +00:00
	float fltval;
2009-01-25 20:22:05 +00:00
	int opcontext;
2009-01-30 12:18:08 +00:00
	uiMenuHandleFunc eventfunc;
2009-01-25 20:22:05 +00:00
	void *argv;
2009-01-30 12:18:08 +00:00
	uiMenuCreateFunc newlevel;
	PointerRNA rnapoin;
2009-01-25 20:22:05 +00:00
	ListBase items;
};

typedef struct uiMenuInfo {
	uiMenuItem *head;
2009-01-30 12:18:08 +00:00
	int mx, my, popup, slideout;
2009-01-25 20:22:05 +00:00
	int startx, starty;
} uiMenuInfo;
2009-01-30 12:18:08 +00:00
/************************ Menu Definitions to uiBlocks ***********************/
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
const char *ui_menu_enumpropname(char *opname, const char *propname, int retval)
2009-01-25 20:22:05 +00:00
{
2009-01-30 12:18:08 +00:00
	wmOperatorType *ot = WM_operatortype_find(opname);
	PointerRNA ptr;
	PropertyRNA *prop;

	if(!ot || !ot->srna)
		return "";
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
	RNA_pointer_create(NULL, ot->srna, NULL, &ptr);
	prop = RNA_struct_find_property(&ptr, propname);
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
	if(prop) {
		const EnumPropertyItem *item;
		int totitem, i;

		RNA_property_enum_items(&ptr, prop, &item, &totitem);

		for(i = 0; i < totitem; i++) {
			if(item[i].value == retval)
				return item[i].name;
		}
	}
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
	return "";
2009-01-25 20:22:05 +00:00
}
2009-01-30 12:18:08 +00:00
/* make a menu level from enum properties */
static void menu_item_enum_opname_menu(bContext *C, uiMenuItem *head, void *arg)
2009-01-25 20:22:05 +00:00
{
2009-01-30 12:18:08 +00:00
	uiBut *but = arg; /* parent caller */
	char *opname = but->func_arg1;
	char *propname = but->func_arg2;
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
	uiMenuItemsEnumO(head, opname, propname);
2009-01-25 20:22:05 +00:00
}
2009-01-30 12:18:08 +00:00
static void menu_item_enum_rna_menu(bContext *C, uiMenuItem *head, void *arg)
2009-01-25 20:22:05 +00:00
{
2009-01-30 12:18:08 +00:00
	uiBut *but = arg; /* parent caller */
	char *propname = but->func_arg1;
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
uiMenuItemsEnumR ( head , & but - > rnapoin , propname ) ;
2009-01-25 20:22:05 +00:00
}
static uiBlock *ui_block_func_MENU_ITEM(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
{
	uiBlock *block;
	uiBut *but;
	uiMenuInfo *info= arg_info;
	uiMenuItem *head, *item;
	ScrArea *sa;
	ARegion *ar;
	static int counter= 0;
	int width, height, icon;
	int startx, starty, x1, y1;
	char str[16];

	head= info->head;
	height= 0;

	/* block stuff first, need to know the font */
	sprintf(str, "tb %d", counter++);
	block= uiBeginBlock(C, handle->region, str, UI_EMBOSSP, UI_HELV);
	uiBlockSetButmFunc(block, head->eventfunc, head->argv);
	block->themecol= TH_MENU_ITEM;
	block->direction= UI_DOWN;

	width= 50; // fixed width, uiMenuPopupBoundsBlock will compute the actual width

	for(item= head->items.first; item; item= item->next) {
		if(0) height+= PUP_LABELH; // XXX sepr line
		else height+= MENU_BUTTON_HEIGHT;
	}
	startx= 0;
	starty= 0;

	/* here we go! */
	if(head->name[0]) {
		char titlestr[256];

		uiSetCurFont(block, UI_HELVB);

		if(head->icon) {
			width+= 20;
			sprintf(titlestr, " %s", head->name);
			uiDefIconTextBut(block, LABEL, 0, head->icon, titlestr, startx, (short)(starty+height), width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
		}
		else {
			but= uiDefBut(block, LABEL, 0, head->name, startx, (short)(starty+height), width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
			but->flag= UI_TEXT_LEFT;
		}

		uiSetCurFont(block, UI_HELV);
		//uiDefBut(block, SEPR, 0, "", startx, (short)(starty+height)-MENU_SEPR_HEIGHT, width, MENU_SEPR_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
	}

	x1= startx;
	y1= starty+height - MENU_BUTTON_HEIGHT; // - MENU_SEPR_HEIGHT;

	for(item= head->items.first; item; item= item->next) {
		if(item->type==MENU_ITEM_LEVEL) {
			uiDefIconTextMenuBut(block, item->newlevel, NULL, ICON_RIGHTARROW_THIN, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL);
			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_LEVEL_OPNAME_ENUM) {
			but= uiDefIconTextMenuBut(block, menu_item_enum_opname_menu, NULL, ICON_RIGHTARROW_THIN, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL);

			/* XXX warning, abuse of func_arg! */
			but->poin= (char*)but;
			but->func_arg1= item->opname;
			but->func_arg2= item->propname;

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_LEVEL_RNA_ENUM) {
			but= uiDefIconTextMenuBut(block, menu_item_enum_rna_menu, NULL, ICON_RIGHTARROW_THIN, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL);

			/* XXX warning, abuse of func_arg! */
			but->poin= (char*)but;
			but->rnapoin= item->rnapoin;
			but->func_arg1= item->propname;

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_OPNAME_BOOL) {
			but= uiDefIconTextButO(block, BUTM, item->opname, item->opcontext, item->icon, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, "");
			RNA_boolean_set(uiButGetOperatorPtrRNA(but), item->propname, item->boolval);

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_OPNAME_ENUM) {
			const char *name;
			char bname[64];

			name= ui_menu_enumpropname(item->opname, item->propname, item->enumval);
			BLI_strncpy(bname, name, sizeof(bname));

			but= uiDefIconTextButO(block, BUTM, item->opname, item->opcontext, item->icon, bname, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, "");
			RNA_enum_set(uiButGetOperatorPtrRNA(but), item->propname, item->enumval);

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_OPNAME_INT) {
			but= uiDefIconTextButO(block, BUTM, item->opname, head->opcontext, item->icon, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, "");
			RNA_int_set(uiButGetOperatorPtrRNA(but), item->propname, item->intval);

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_OPNAME_FLOAT) {
			but= uiDefIconTextButO(block, BUTM, item->opname, item->opcontext, item->icon, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, "");
			RNA_float_set(uiButGetOperatorPtrRNA(but), item->propname, item->fltval);

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_OPNAME) {
			uiDefIconTextButO(block, BUTM, item->opname, item->opcontext, item->icon, NULL, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL);

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_RNA_BOOL) {
			PropertyRNA *prop= RNA_struct_find_property(&item->rnapoin, item->propname);

			if(prop && RNA_property_type(&item->rnapoin, prop)==PROP_BOOLEAN) {
				icon= (RNA_property_boolean_get(&item->rnapoin, prop))? ICON_CHECKBOX_HLT: ICON_CHECKBOX_DEHLT;
				uiDefIconTextButR(block, TOG, 0, icon, NULL, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &item->rnapoin, item->propname, 0, 0, 0, 0, 0, NULL);
			}
			else {
				uiBlockSetButLock(block, 1, "");
				uiDefIconTextBut(block, BUT, 0, ICON_BLANK1, item->propname, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL, 0.0, 0.0, 0, 0, "");
				uiBlockClearButLock(block);
			}

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_RNA_ENUM) {
			PropertyRNA *prop= RNA_struct_find_property(&item->rnapoin, item->propname);

			if(prop && RNA_property_type(&item->rnapoin, prop)==PROP_ENUM) {
				icon= (RNA_property_enum_get(&item->rnapoin, prop)==item->enumval)? ICON_CHECKBOX_HLT: ICON_CHECKBOX_DEHLT;
				uiDefIconTextButR(block, ROW, 0, icon, NULL, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &item->rnapoin, item->propname, 0, 0, item->enumval, 0, 0, NULL);
			}
			else {
				uiBlockSetButLock(block, 1, "");
				uiDefIconTextBut(block, BUT, 0, ICON_BLANK1, item->propname, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, NULL, 0.0, 0.0, 0, 0, "");
				uiBlockClearButLock(block);
			}

			y1 -= MENU_BUTTON_HEIGHT;
		}
		else if(item->type==MENU_ITEM_ITEM) {
			uiDefIconTextButF(block, BUTM, B_NOP, item->icon, item->name, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &handle->retvalue, 0.0, 0.0, 0, item->retval, "");
			y1 -= MENU_BUTTON_HEIGHT;
		}
		else {
			uiDefBut(block, SEPR, 0, "", x1, y1, width+16, MENU_SEPR_HEIGHT-1, NULL, 0.0, 0.0, 0, 0, "");
			y1 -= MENU_SEPR_HEIGHT;
		}
	}
	if(info->popup) {
		uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT|UI_BLOCK_RET_1);
		uiBlockSetDirection(block, UI_DOWN);

		/* here we set an offset for the mouse position */
		uiMenuPopupBoundsBlock(block, 1, 0, -height + MENU_BUTTON_HEIGHT/2);
	}
	else {
		/* for a header menu we set the direction automatically */
		if(!info->slideout) {
			sa= CTX_wm_area(C);
			ar= CTX_wm_region(C);

			if(sa && sa->headertype==HEADERDOWN) {
				if(ar && ar->regiontype==RGN_TYPE_HEADER) {
					uiBlockSetDirection(block, UI_TOP);
					uiBlockFlipOrder(block);
				}
			}
		}

		uiTextBoundsBlock(block, 50);
	}

	/* if the menu slides out of another menu, override the direction */
	if(info->slideout)
		uiBlockSetDirection(block, UI_RIGHT);

	uiEndBlock(C, block);

	return block;
}

uiPopupBlockHandle *ui_popup_menu_create(bContext *C, ARegion *butregion, uiBut *but, uiMenuCreateFunc menu_func, void *arg)
{
	uiPopupBlockHandle *handle;
	uiMenuItem *head;
	uiMenuInfo info;

	head= MEM_callocN(sizeof(uiMenuItem), "menu dummy");
	head->opcontext= WM_OP_EXEC_REGION_WIN;

	menu_func(C, head, arg);

	memset(&info, 0, sizeof(info));
	info.head= head;
	info.slideout= (but && (but->block->flag & UI_BLOCK_LOOP));

	handle= ui_popup_block_create(C, butregion, but, NULL, ui_block_func_MENU_ITEM, &info);

	BLI_freelistN(&head->items);
	MEM_freeN(head);

	return handle;
}
/*************************** Menu Creating API **************************/

/* internal add func */
static uiMenuItem *ui_menu_add_item(uiMenuItem *head, const char *name, int icon, int argval)
{
	uiMenuItem *item= MEM_callocN(sizeof(uiMenuItem), "menu item");

	BLI_strncpy(item->name, name, MAX_MENU_STR);

	if(icon)
		item->icon= icon;
	else
		item->icon= ICON_BLANK1;

	item->retval= argval;
	item->opcontext= head->opcontext;

	BLI_addtail(&head->items, item);

	return item;
}

/* set callback for regular items */
void uiMenuFunc(uiMenuItem *head, void (*eventfunc)(bContext *, void *, int), void *argv)
{
	head->eventfunc= eventfunc;
	head->argv= argv;
}

/* optionally set different context for all items in one level */
void uiMenuContext(uiMenuItem *head, int opcontext)
{
	head->opcontext= opcontext;
}

/* regular item, with retval */
void uiMenuItemVal(uiMenuItem *head, const char *name, int icon, int argval)
{
	uiMenuItem *item= ui_menu_add_item(head, name, icon, argval);

	item->type= MENU_ITEM_ITEM;
}

/* regular operator item */
void uiMenuItemO(uiMenuItem *head, int icon, char *opname)
{
	uiMenuItem *item= ui_menu_add_item(head, "", icon, 0);

	item->opname= opname; // static!
	item->type= MENU_ITEM_OPNAME;
}

/* single operator item with enum property */
void uiMenuItemEnumO(uiMenuItem *head, int icon, char *opname, char *propname, int value)
{
	uiMenuItem *item= ui_menu_add_item(head, "", icon, 0);

	item->opname= opname; // static!
	item->propname= propname; // static!
	item->enumval= value;
	item->type= MENU_ITEM_OPNAME_ENUM;
}

/* single operator item with int property */
void uiMenuItemIntO(uiMenuItem *head, const char *name, int icon, char *opname, char *propname, int value)
{
	uiMenuItem *item= ui_menu_add_item(head, name, icon, 0);

	item->opname= opname; // static!
	item->propname= propname; // static!
	item->intval= value;
	item->type= MENU_ITEM_OPNAME_INT;
}

/* single operator item with float property */
void uiMenuItemFloatO(uiMenuItem *head, const char *name, int icon, char *opname, char *propname, float value)
{
	uiMenuItem *item= ui_menu_add_item(head, name, icon, 0);

	item->opname= opname; // static!
	item->propname= propname; // static!
	item->fltval= value;
	item->type= MENU_ITEM_OPNAME_FLOAT;
}

/* single operator item with boolean property */
void uiMenuItemBooleanO(uiMenuItem *head, const char *name, int icon, char *opname, char *propname, int value)
{
	uiMenuItem *item= ui_menu_add_item(head, name, icon, 0);

	item->opname= opname; // static!
	item->propname= propname; // static!
	item->boolval= value;
	item->type= MENU_ITEM_OPNAME_BOOL;
}

/* add all operator items for an enum property */
void uiMenuItemsEnumO(uiMenuItem *head, char *opname, char *propname)
{
	wmOperatorType *ot= WM_operatortype_find(opname);
	PointerRNA ptr;
	PropertyRNA *prop;

	if(!ot || !ot->srna)
		return;

	RNA_pointer_create(NULL, ot->srna, NULL, &ptr);
	prop= RNA_struct_find_property(&ptr, propname);

	if(prop && RNA_property_type(&ptr, prop)==PROP_ENUM) {
		const EnumPropertyItem *item;
		int totitem, i;

		RNA_property_enum_items(&ptr, prop, &item, &totitem);

		for(i=0; i<totitem; i++)
			uiMenuItemEnumO(head, 0, opname, propname, item[i].value);
	}
}

/* rna property toggle */
void uiMenuItemBooleanR(uiMenuItem *head, PointerRNA *ptr, char *propname)
{
	uiMenuItem *item= ui_menu_add_item(head, "", 0, 0);

	item->propname= propname; // static!
	item->rnapoin= *ptr;
	item->type= MENU_ITEM_RNA_BOOL;
}

void uiMenuItemEnumR(uiMenuItem *head, PointerRNA *ptr, char *propname, int value)
{
	uiMenuItem *item= ui_menu_add_item(head, "", 0, 0);

	item->propname= propname; // static!
	item->rnapoin= *ptr;
	item->enumval= value;
	item->type= MENU_ITEM_RNA_ENUM;
}

/* add all rna items with property */
void uiMenuItemsEnumR(uiMenuItem *head, PointerRNA *ptr, char *propname)
{
	PropertyRNA *prop;

	prop= RNA_struct_find_property(ptr, propname);

	if(prop && RNA_property_type(ptr, prop)==PROP_ENUM) {
		const EnumPropertyItem *item;
		int totitem, i;

		RNA_property_enum_items(ptr, prop, &item, &totitem);

		for(i=0; i<totitem; i++)
			uiMenuItemEnumR(head, ptr, propname, item[i].value);
	}
}

/* generic new menu level */
void uiMenuLevel(uiMenuItem *head, const char *name, uiMenuCreateFunc newlevel)
{
	uiMenuItem *item= ui_menu_add_item(head, name, 0, 0);

	item->type= MENU_ITEM_LEVEL;
	item->newlevel= newlevel;
}

/* make a new level from enum properties */
void uiMenuLevelEnumO(uiMenuItem *head, char *opname, char *propname)
{
	uiMenuItem *item= ui_menu_add_item(head, "", 0, 0);
	wmOperatorType *ot;

	item->type= MENU_ITEM_LEVEL_OPNAME_ENUM;

	ot= WM_operatortype_find(opname);
	if(ot)
		BLI_strncpy(item->name, ot->name, MAX_MENU_STR);

	item->opname= opname; // static!
	item->propname= propname; // static!

	BLI_addtail(&head->items, item);
}

/* make a new level from enum properties */
void uiMenuLevelEnumR(uiMenuItem *head, PointerRNA *ptr, char *propname)
{
	uiMenuItem *item= ui_menu_add_item(head, "", 0, 0);
	PropertyRNA *prop;

	item->type= MENU_ITEM_LEVEL_RNA_ENUM;

	prop= RNA_struct_find_property(ptr, propname);
	if(prop)
		BLI_strncpy(item->name, RNA_property_ui_name(ptr, prop), MAX_MENU_STR);

	item->rnapoin= *ptr;
	item->propname= propname; // static!

	BLI_addtail(&head->items, item);
}

/* separator */
void uiMenuSeparator(uiMenuItem *head)
{
	uiMenuItem *item= ui_menu_add_item(head, "", 0, 0);

	item->type= MENU_ITEM_SEPARATOR;
}
/*************************** Popup Menu API **************************/

/* only return handler, and set optional title */
uiMenuItem *uiPupMenuBegin(const char *title, int icon)
{
	uiMenuItem *item= MEM_callocN(sizeof(uiMenuItem), "menu start");

	item->type= MENU_ITEM_TITLE;
	item->opcontext= WM_OP_EXEC_REGION_WIN;
	item->icon= icon;

	/* NULL is no title */
	if(title)
		BLI_strncpy(item->name, title, MAX_MENU_STR);

	return item;
}
/* set the whole structure to work */
void uiPupMenuEnd(bContext *C, uiMenuItem *head)
{
	wmWindow *window= CTX_wm_window(C);
	uiMenuInfo info;
	uiPopupBlockHandle *menu;

	memset(&info, 0, sizeof(info));
	info.popup= 1;
	info.mx= window->eventstate->x;
	info.my= window->eventstate->y;
	info.head= head;

	menu= ui_popup_block_create(C, NULL, NULL, NULL, ui_block_func_MENU_ITEM, &info);
	menu->popup= 1;

	UI_add_popup_handlers(&window->handlers, menu);
	WM_event_add_mousemove(C);

	BLI_freelistN(&head->items);
	MEM_freeN(head);
}
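For illustration, a minimal sketch of the begin/items/end pattern this API is meant for; the EXAMPLE_OT_* operator names, the "mode" property and the title are placeholders, not part of this code:

static void example_invoke_popup(bContext *C)
{
	/* hypothetical sketch only, operator/property names are placeholders */
	uiMenuItem *head= uiPupMenuBegin("Example", ICON_HELP);

	uiMenuItemO(head, 0, "EXAMPLE_OT_do_something");            /* plain operator item */
	uiMenuItemEnumO(head, 0, "EXAMPLE_OT_set_mode", "mode", 1); /* operator item with an enum value */
	uiMenuSeparator(head);
	uiMenuItemsEnumO(head, "EXAMPLE_OT_set_mode", "mode");      /* one item per enum value */

	uiPupMenuEnd(C, head); /* creates the popup block and frees head */
}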
/* this one only to be called with operatortype name option */
void uiPupMenu(bContext *C, int maxrow, uiMenuHandleFunc func, void *arg, char *str, ...)
{
	wmWindow *window= CTX_wm_window(C);
	uiPupMenuInfo info;
	uiPopupBlockHandle *menu;

	memset(&info, 0, sizeof(info));
	info.mx= window->eventstate->x;
	info.my= window->eventstate->y;
	info.maxrow= maxrow;
	info.instr= str;

	menu= ui_popup_block_create(C, NULL, NULL, NULL, ui_block_func_PUPMENU, &info);
	menu->popup= 1;

	UI_add_popup_handlers(&window->handlers, menu);
	WM_event_add_mousemove(C);

	menu->popup_func= func;
	menu->popup_arg= arg;
}

/* standard pupmenus */

static void operator_cb(bContext *C, void *arg, int retval)
{
	const char *opname= arg;

	if(opname && retval > 0)
		WM_operator_name_call(C, opname, WM_OP_EXEC_DEFAULT, NULL);
}

static void vconfirm(bContext *C, char *opname, char *title, char *itemfmt, va_list ap)
{
	char *s, buf[512];

	s= buf;
	if(title) s+= sprintf(s, "%s%%t|", title);
	vsprintf(s, itemfmt, ap);

	uiPupMenu(C, 0, operator_cb, opname, buf);
}

static void confirm(bContext *C, char *opname, char *title, char *itemfmt, ...)
{
	va_list ap;

	va_start(ap, itemfmt);
	vconfirm(C, opname, title, itemfmt, ap);
	va_end(ap);
}

void uiPupMenuOkee(bContext *C, char *opname, char *str, ...)
{
	va_list ap;
	char titlestr[256];

	sprintf(titlestr, "OK? %%i%d", ICON_HELP);

	va_start(ap, str);
	vconfirm(C, opname, titlestr, str, ap);
	va_end(ap);
}
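As a usage illustration for this confirmation wrapper (the operator name below is a placeholder), a call would look like:

	/* hypothetical call: shows an "OK?" popup and, on confirm, runs the operator */
	uiPupMenuOkee(C, "EXAMPLE_OT_revert_file", "Revert to saved file?");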
void uiPupMenuSaveOver(bContext *C, char *opname, char *filename, ...)
{
	size_t len= strlen(filename);

	if(len==0)
		return;

	if(BLI_exists(filename)==0)
		operator_cb(C, opname, 1);

	if(filename[len-1]=='/' || filename[len-1]=='\\') {
		uiPupMenuError(C, "Cannot overwrite a directory");
		return;
	}

	confirm(C, opname, "Save over", filename);
}

void uiPupMenuNotice(bContext *C, char *str, ...)
{
	va_list ap;

	va_start(ap, str);
	vconfirm(C, NULL, NULL, str, ap);
	va_end(ap);
}

void uiPupMenuError(bContext *C, char *str, ...)
{
	va_list ap;
	char nfmt[256];
	char titlestr[256];

	sprintf(titlestr, "Error %%i%d", ICON_ERROR);
	sprintf(nfmt, "%s", str);

	va_start(ap, str);
	vconfirm(C, NULL, titlestr, nfmt, ap);
	va_end(ap);
}

void uiPupMenuReports(bContext *C, ReportList *reports)
{
	Report *report;
	DynStr *ds;
	char *str;

	if(!reports || !reports->list.first)
		return;
	if(!CTX_wm_window(C))
		return;

	ds= BLI_dynstr_new();

	for(report=reports->list.first; report; report=report->next) {
		if(report->type >= RPT_ERROR)
			BLI_dynstr_appendf(ds, "Error %%i%d%%t|%s", ICON_ERROR, report->message);
		else if(report->type >= RPT_WARNING)
			BLI_dynstr_appendf(ds, "Warning %%i%d%%t|%s", ICON_ERROR, report->message);
	}

	str= BLI_dynstr_get_cstring(ds);
	uiPupMenu(C, 0, NULL, NULL, str);
	MEM_freeN(str);

	BLI_dynstr_free(ds);
}

/*************************** Popup Block API **************************/

void uiPupBlockO(bContext *C, uiBlockCreateFunc func, void *arg, char *opname, int opcontext)
{
	wmWindow *window= CTX_wm_window(C);
	uiPopupBlockHandle *handle;

	handle= ui_popup_block_create(C, NULL, NULL, func, NULL, arg);
	handle->popup= 1;
	handle->opname= opname;
	handle->opcontext= opcontext;

	UI_add_popup_handlers(&window->handlers, handle);
	WM_event_add_mousemove(C);
}

void uiPupBlock(bContext *C, uiBlockCreateFunc func, void *arg)
{
	uiPupBlockO(C, func, arg, NULL, 0);
}