/**
 * ***** BEGIN GPL LICENSE BLOCK *****
 *
 * This program is free software; you can redistribute it and/or
 * modify it under the terms of the GNU General Public License
 * as published by the Free Software Foundation; either version 2
 * of the License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program; if not, write to the Free Software Foundation,
 * Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
 *
 * The Original Code is Copyright (C) 2008 Blender Foundation.
 * All rights reserved.
 *
 * Contributor(s): Blender Foundation
 *
 * ***** END GPL LICENSE BLOCK *****
 */
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There are two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48; these can be enabled again once trunk is
merged into this branch.
Further, this is not all of the interface code; many parts are still commented out:
* interface.c: nearly all button types; still missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions.c: a part of interface.c that was split off; it contains code
to create regions for tooltips, menus, pupmenu (that one is currently crashing)
and the color chooser, basically regions with buttons, which is fairly
independent of the core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. This works
but still uses bad globals and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated (see the example after this list).
* The user interface code is in UI_interface.h and other UI_* files.
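
For illustration, the kind of declaration pair ED_datafiles.h can now centralize
might look like this; the symbol names below are an assumption, not copied from
the header:

/* illustrative only: a generated data file exposed through ED_datafiles.h */
extern int datatoc_blenderbuttons_size;
extern char datatoc_blenderbuttons[];
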
Core:
* The API for creating blocks, buttons, etc. is still nearly the same. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That does mean we need some sort of
persistence, even though the Blender model is to recreate buttons for each
redraw. So when a new block is created, some matching happens to find out which
buttons correspond to buttons in the previously created block, and for activated
buttons some data is then copied over to the new button (see the sketch after
this list).
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure how
it would integrate here.
* Currently only full window redraws are used; this should become per region
and maybe per button later.
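
A minimal sketch of what the matching between the old and the new block could
look like; the struct fields and function names here are simplified assumptions
for illustration, not the actual implementation:

#include <string.h>

/* simplified stand-ins for uiBut/uiBlock, illustrative only */
typedef struct SketchBut {
	struct SketchBut *next;
	int type;           /* button type (number, text, menu, ...) */
	const char *str;    /* label drawn on the button */
	void *poin;         /* data the button edits */
	void *active;       /* handling data if the button is activated */
} SketchBut;

typedef struct SketchBlock {
	SketchBut *buttons; /* singly linked list of buttons */
} SketchBlock;

/* find the old button that corresponds to a freshly created one */
static SketchBut *sketch_but_find_match(SketchBlock *oldblock, SketchBut *but)
{
	SketchBut *oldbut;

	for (oldbut = oldblock->buttons; oldbut; oldbut = oldbut->next)
		if (oldbut->type == but->type && oldbut->poin == but->poin &&
		    strcmp(oldbut->str, but->str) == 0)
			return oldbut;

	return NULL;
}

/* after rebuilding the block on redraw, carry activation data over */
static void sketch_block_match_old(SketchBlock *newblock, SketchBlock *oldblock)
{
	SketchBut *but, *oldbut;

	for (but = newblock->buttons; but; but = but->next) {
		oldbut = sketch_but_find_match(oldblock, but);
		if (oldbut && oldbut->active) {
			but->active = oldbut->active; /* keep editing state alive */
			oldbut->active = NULL;
		}
	}
}
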
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead; this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be
made non-blocking too. That means variables that were previously on the stack
now need to be stored in a struct so that they can be accessed again when the
operator receives more events.
* Additionally, the position in the ui_do_but code used to indicate the state;
now that state needs to be set explicitly in order to handle the right events in
the right state. So an activated button can be in one of these states: init,
highlight, wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type a ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button, for
example as text is being typed, and there is an option in the code to enable
this. Since the code is non-blocking and can even deal with the button being
deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is not
being edited, since that would mean data is edited without correct updates
happening while some other part of Blender may be accessing it in the meantime.
So data values, strings and vectors are written to a temporary location and only
flushed in the apply function (see the sketch after this list).
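
A rough sketch of the persistent state plus the temporary-value/apply split
described above; the names, states and fields are illustrative assumptions, not
the real interface_ops.c API:

/* subset of the states listed above, illustrative only */
typedef enum SketchButtonState {
	BUTTON_STATE_INIT,
	BUTTON_STATE_HIGHLIGHT,
	BUTTON_STATE_NUM_EDITING,
	BUTTON_STATE_TEXT_EDITING,
	BUTTON_STATE_EXIT
} SketchButtonState;

/* data that used to live on the ui_do_but stack, now kept between events */
typedef struct SketchHandleButtonData {
	SketchButtonState state;
	double value;      /* temporary number value while dragging */
	char str[256];     /* temporary string while text editing */
} SketchHandleButtonData;

/* called each time the button activate operator receives a drag event */
static void sketch_button_handle_drag(SketchHandleButtonData *data, int mouse_dx)
{
	if (data->state == BUTTON_STATE_NUM_EDITING) {
		/* only the temporary value changes while dragging */
		data->value += 0.1 * (double)mouse_dx;
	}
}

/* flush the temporary value into the actual data (but->poin) */
static void sketch_apply_but_num(float *poin, SketchHandleButtonData *data)
{
	*poin = (float)data->value;
	/* notifiers/redraws for the edited data would be triggered here */
}
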
Regions:
* Menus, the color chooser, tooltips, etc. all create screen level regions. Such
a menu block gives a handle to the button that created it, which will contain
the results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way, and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates, since these are now contained in a region (see the sketch after
this list).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when, and to then do correct compositing.
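
A sketch of bringing a positioned menu block from window coordinates back into
region coordinates; the rect and button layout below are simplified assumptions:

/* simplified region rect and button rect, illustrative only */
typedef struct SketchRecti { int xmin, xmax, ymin, ymax; } SketchRecti;

typedef struct SketchRegionBut {
	struct SketchRegionBut *next;
	float x1, y1, x2, y2; /* button rectangle */
} SketchRegionBut;

/* offset the block and its buttons by the region origin in window space */
static void sketch_block_to_region_coords(const SketchRecti *region_winrct,
                                          SketchRegionBut *buttons,
                                          float *block_minx, float *block_miny)
{
	const float ofsx = (float)region_winrct->xmin;
	const float ofsy = (float)region_winrct->ymin;
	SketchRegionBut *but;

	*block_minx -= ofsx;
	*block_miny -= ofsy;

	for (but = buttons; but; but = but->next) {
		but->x1 -= ofsx; but->x2 -= ofsx;
		but->y1 -= ofsy; but->y2 -= ofsy;
	}
}
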
Testing:
* The header in the time space currently has some buttons to test the UI code.

#include <stdarg.h>
#include <stdlib.h>
#include <string.h>

#include "MEM_guardedalloc.h"

#include "DNA_screen_types.h"
#include "DNA_view2d_types.h"
#include "DNA_userdef_types.h"
# include "DNA_windowmanager_types.h"
# include "BLI_arithb.h"
# include "BLI_blenlib.h"
2008-12-29 13:38:08 +00:00
# include "BLI_dynstr.h"
# include "BKE_context.h"
2009-06-24 16:44:54 +00:00
# include "BKE_icons.h"
2008-12-29 13:38:08 +00:00
# include "BKE_report.h"
# include "BKE_screen.h"
2009-05-20 15:20:24 +00:00
# include "BKE_texture.h"
# include "BKE_utildefines.h"
# include "WM_api.h"
# include "WM_types.h"

2.5: WM Compositing
* Triple Buffer is now more complete:
- Proper handling of window resize, duplicate, etc.
- It now uses 3x3 textures (or fewer) if the power of two sizes do not match
well. That still has a worst case waste of 23.4%, but it is better than 300%.
- It can also use the ARB/NV/EXT_texture_rectangle extension now, which may be
supported on hardware that does not support ARB_texture_non_power_of_two.
- Gesture, menu and brush redraws now require no redraws at all from the area
regions. So even in a high poly scene, just moving the paint cursor or opening
a menu should be fast.
* Testing can be done by setting the "Window Draw Method" in the User
Preferences in the outliner. "Overlap" is still the default, since "Triple
Buffer" has not been tested on computers other than mine; I would like to avoid
crashing Blender on startup in case there is a common bug, but it's ready for
testing now.
- For reference, "Full" draws the full window each time.
- "Triple Buffer" should work for both swap copy and swap exchange systems; the
latter still needs the -E command line option for "Overlap".
- Resizing and going fullscreen still give flicker here, but no more than with
"Full" drawing.
* Partial Redraw was added. ED_region_tag_redraw_partial takes a rect in window
coordinates to define a subarea of the region. On region draw it will then set
glScissor to a smaller area, and ar->drawrct will always be set to either the
partial or full window rect. The latter can then be used for clipping in the 3D
view or for clipping interface drawing. Neither is implemented yet (see the
sketch below).
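
A sketch of how the partial redraw path could be wired up, assuming a simplified
region struct; the real ARegion and ED_region_tag_redraw_partial may differ:

#include <GL/gl.h>

/* simplified stand-ins for the region rects, illustrative only */
typedef struct SketchRect { int xmin, xmax, ymin, ymax; } SketchRect;

typedef struct SketchRegion {
	SketchRect winrct;   /* full region rect, in window coordinates */
	SketchRect drawrct;  /* rect actually drawn this redraw */
	int do_draw_partial;
} SketchRegion;

/* tag a subarea (window coordinates) of the region for redraw */
static void sketch_region_tag_redraw_partial(SketchRegion *ar, const SketchRect *rect)
{
	ar->drawrct = *rect;
	ar->do_draw_partial = 1;
}

/* before region drawing: clip to either the partial or the full rect */
static void sketch_region_set_scissor(SketchRegion *ar)
{
	if (!ar->do_draw_partial)
		ar->drawrct = ar->winrct; /* drawrct stays valid for drawing code */

	glScissor(ar->drawrct.xmin, ar->drawrct.ymin,
	          ar->drawrct.xmax - ar->drawrct.xmin,
	          ar->drawrct.ymax - ar->drawrct.ymin);
	glEnable(GL_SCISSOR_TEST);
}
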
#include "wm_draw.h"
# include "wm_subwindow.h"
# include "wm_window.h"
2009-01-25 20:22:05 +00:00
# include "RNA_access.h"
# include "BIF_gl.h"
# include "UI_interface.h"
2009-06-05 16:11:18 +00:00
# include "UI_interface_icons.h"
2008-12-03 17:11:50 +00:00
# include "UI_view2d.h"
# include "BLF_api.h"
# include "ED_screen.h"
2008-12-26 13:11:04 +00:00
# include "interface_intern.h"
#define MENU_BUTTON_HEIGHT  20
#define MENU_SEPR_HEIGHT    6
#define B_NOP               -1
#define MENU_SHADOW_SIDE    8
#define MENU_SHADOW_BOTTOM  10
#define MENU_TOP            8
/*********************** Menu Data Parsing *********************/

typedef struct {
    char *str;
    int retval;
    int icon;
} MenuEntry;

typedef struct {
    char *instr;
    char *title;
    int titleicon;

    MenuEntry *items;
    int nitems, itemssize;
} MenuData;

static MenuData *menudata_new(char *instr)
{
    MenuData *md = MEM_mallocN(sizeof(*md), "MenuData");

    md->instr = instr;
    md->title = NULL;
    md->titleicon = 0;
    md->items = NULL;
    md->nitems = md->itemssize = 0;

    return md;
}

static void menudata_set_title(MenuData *md, char *title, int titleicon)
{
    if (!md->title)
        md->title = title;
    if (!md->titleicon)
        md->titleicon = titleicon;
}

static void menudata_add_item(MenuData *md, char *str, int retval, int icon)
{
    if (md->nitems == md->itemssize) {
        int nsize = md->itemssize ? (md->itemssize << 1) : 1;
        MenuEntry *oitems = md->items;

        md->items = MEM_mallocN(nsize * sizeof(*md->items), "md->items");
        if (oitems) {
            memcpy(md->items, oitems, md->nitems * sizeof(*md->items));
            MEM_freeN(oitems);
        }

        md->itemssize = nsize;
    }

    md->items[md->nitems].str = str;
    md->items[md->nitems].retval = retval;
    md->items[md->nitems].icon = icon;
    md->nitems++;
}

void menudata_free(MenuData *md)
{
    MEM_freeN(md->instr);
    if (md->items)
        MEM_freeN(md->items);
    MEM_freeN(md);
}
/**
 * Parse menu description strings, string is of the
 * form "[sss%t|]{(sss[%xNN]|), (%l|)}", sss%t indicates the
 * menu title, sss or sss%xNN indicates an option,
 * if %xNN is given then NN is the return value if
 * that option is selected, otherwise the return value
 * is the index of the option (starting with 1). %l
 * indicates a separator.
 *
 * @param str String to be parsed.
 * @retval new menudata structure, free with menudata_free()
 */
MenuData *decompose_menu_string(char *str)
{
    char *instr = BLI_strdup(str);
    MenuData *md = menudata_new(instr);
    char *nitem = NULL, *s = instr;
    int nicon = 0, nretval = 1, nitem_is_title = 0;

    while (1) {
        char c = *s;

        if (c == '%') {
            if (s[1] == 'x') {
                nretval = atoi(s + 2);

                *s = '\0';
                s++;
            } else if (s[1] == 't') {
                nitem_is_title = 1;

                *s = '\0';
                s++;
            } else if (s[1] == 'l') {
                nitem = "%l";
                s++;
            } else if (s[1] == 'i') {
                nicon = atoi(s + 2);

                *s = '\0';
                s++;
            }
        } else if (c == '|' || c == '\n' || c == '\0') {
            if (nitem) {
                *s = '\0';

                if (nitem_is_title) {
                    menudata_set_title(md, nitem, nicon);
                    nitem_is_title = 0;
                } else {
                    /* prevent the separator from getting a value */
                    if (nitem[0] == '%' && nitem[1] == 'l')
                        menudata_add_item(md, nitem, -1, nicon);
                    else
                        menudata_add_item(md, nitem, nretval, nicon);

                    nretval = md->nitems + 1;
                }

                nitem = NULL;
                nicon = 0;
            }

            if (c == '\0')
                break;
        } else if (!nitem)
            nitem = s;

        s++;
    }

    return md;
}
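To make the string format documented above concrete, here is a small illustrative sketch; it is not part of the original file, and both the menu string and the wrapper function name are made up for the example.

/* Illustration only: what decompose_menu_string() yields for a sample string
 * in the format documented above. The wrapper function is hypothetical. */
static void example_decompose_menu_string(void)
{
    char buf[] = "Add%t|Mesh%x1|Curve%x2|%l|Empty%x10";
    MenuData *md = decompose_menu_string(buf);

    /* md->title == "Add" (marked by %t), md->nitems == 4:
     *   items[0]: str "Mesh",  retval 1   (from %x1)
     *   items[1]: str "Curve", retval 2   (from %x2)
     *   items[2]: str "%l",    retval -1  (separator)
     *   items[3]: str "Empty", retval 10  (from %x10)
     */
    menudata_free(md);
}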
void ui_set_name_menu(uiBut *but, int value)
{
    MenuData *md;
    int i;

    md = decompose_menu_string(but->str);
    for (i = 0; i < md->nitems; i++)
        if (md->items[i].retval == value)
            strcpy(but->drawstr, md->items[i].str);

    menudata_free(md);
}
int ui_step_name_menu(uiBut *but, int step)
{
    MenuData *md;
    int value = ui_get_but_val(but);
    int i;

    md = decompose_menu_string(but->str);
    for (i = 0; i < md->nitems; i++)
        if (md->items[i].retval == value)
            break;

    if (step == 1) {
        /* skip separators */
        for (; i < md->nitems - 1; i++) {
            if (md->items[i + 1].retval != -1) {
                value = md->items[i + 1].retval;
                break;
            }
        }
    }
    else {
        if (i > 0) {
            /* skip separators */
            for (; i > 0; i--) {
                if (md->items[i - 1].retval != -1) {
                    value = md->items[i - 1].retval;
                    break;
                }
            }
        }
    }

    menudata_free(md);

    return value;
}
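For context, a minimal sketch of how these two helpers could be combined by caller code; this is not from the original file, and ui_set_but_val() is assumed to be the counterpart of the ui_get_but_val() used above.

/* Illustration only: step a menu button to its next non-separator item.
 * ui_set_but_val() is an assumption (counterpart of ui_get_but_val() above). */
static void example_step_menu_button(uiBut *but)
{
    int value = ui_step_name_menu(but, 1); /* retval of the next item, %l separators skipped */

    ui_set_but_val(but, value);            /* store the new value... */
    ui_set_name_menu(but, value);          /* ...and refresh but->drawstr to the matching label */
}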
/******************** Creating Temporary Regions ******************/

ARegion *ui_add_temporary_region(bScreen *sc)
{
    ARegion *ar;

    ar = MEM_callocN(sizeof(ARegion), "area region");
    BLI_addtail(&sc->regionbase, ar);

    ar->regiontype = RGN_TYPE_TEMPORARY;
    ar->alignment = RGN_ALIGN_FLOAT;

    return ar;
}

void ui_remove_temporary_region(bContext *C, bScreen *sc, ARegion *ar)
{
    if (CTX_wm_window(C))
        wm_draw_region_clear(CTX_wm_window(C), ar);

    ED_region_exit(C, ar);
2.5
View3D has now been split into a local part (RegionView3D) and a
per-area part (old View3D). Currently local is:
- view transform
- camera zoom/offset
- gpencil (todo)
- custom clipping planes
Rest is in Area still, like active camera, draw type, layers,
localview, custom centers, around-settings, transform widget,
gridlines, and so on (mostly stuff as available in header).
To see it work, also added a new feature for region split:
press SHIFT+ALT+CTRL+S for four-split.
The idea is to make a preset 4-split, configured to stick
to top/right/front views for three views.
Another cool idea to explore is to then box-clip all drawing
based on these 3 views.
Note about the code:
- currently view3d still stores some deprecated settings, to
convert from older files. Not all settings are copied over
though, like custom clip planes or the 'lock view to object'.
- since some view3d ops are now on area level, the operators
for it should keep track of that.
Bugfix in transform: quat initialization in operator-invoke missed
one zero.
Also brought back GE to compile for missing Ipos and channels.
    BKE_area_region_free(NULL, ar);    /* NULL: no spacetype */
    BLI_freelinkN(&sc->regionbase, ar);
}
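A minimal sketch of the lifecycle of such a temporary region, assuming an ARegionType with draw/free callbacks as used for the tooltip region below (prototypes, rect and regiondata setup omitted); this is illustrative, not code from the original file.

/* Illustration only: wiring a temporary screen-level region to draw/free
 * callbacks (here the tooltip handlers defined below), then removing it. */
static ARegion *example_open_temporary_region(bContext *C)
{
    static ARegionType type;
    ARegion *ar = ui_add_temporary_region(CTX_wm_screen(C));

    memset(&type, 0, sizeof(ARegionType));
    type.draw = ui_tooltip_region_draw;   /* called by the windowmanager when drawing */
    type.free = ui_tooltip_region_free;   /* called when the region is freed */
    ar->type = &type;

    /* ... fill ar->winrct and ar->regiondata, then init the region ... */

    return ar;
}

static void example_close_temporary_region(bContext *C, ARegion *ar)
{
    ui_remove_temporary_region(C, CTX_wm_screen(C), ar);
}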
/************************* Creating Tooltips **********************/

typedef struct uiTooltipData {
    rcti bbox;
    uiFontStyle fstyle;
    char *tip;
} uiTooltipData;

static void ui_tooltip_region_draw(const bContext *C, ARegion *ar)
{
    uiTooltipData *data = ar->regiondata;

    ui_draw_menu_back(U.uistyles.first, NULL, &data->bbox);

    /* draw text */
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    uiStyleFontSet(&data->fstyle);
    uiStyleFontDraw(&data->fstyle, &data->bbox, data->tip);
}
static void ui_tooltip_region_free(ARegion *ar)
{
uiTooltipData *data;
data = ar->regiondata;
MEM_freeN(data->tip);
MEM_freeN(data);
2.5
Added WM Jobs manager
- WM can manage threaded jobs for you; just provide a couple
of components to get it working:
- customdata, and a free callback for it
- timer step, notifier code
- start callback, update callback
- Once started, each job runs its own timer, and on every
time step it checks for necessary updates, or closes the
job when it is ready.
- No drawing happens in jobs, that's for notifiers!
- Every job stores an owner pointer, and based on this owner
it will prevent multiple jobs from entering the stack.
Instead it will re-use a running job, signal it to stop,
and even allow the caller to re-initialize it.
- Check the new wm_jobs.c for more explanation (a rough usage
sketch follows below); the Jobs API is still under construction.
Fun: BLI_addtail(&wm->jobs, steve); :)
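To make the moving parts concrete, here is a minimal standalone sketch of the pattern described above: a job owns customdata plus a free callback, the start callback does the work in its own thread, and a periodic timer step polls for updates or closes the job when it is done. All names, the struct layout and the use of plain pthreads are illustrative assumptions for this sketch (synchronization is omitted for brevity); it is not the actual wm_jobs.c interface.

#include <pthread.h>

/* illustrative job record; the real wmJob in wm_jobs.c differs */
typedef struct Job {
	void *customdata;                          /* data owned by the job */
	void (*free_customdata)(void *customdata); /* free callback */
	void (*startjob)(void *customdata, short *stop, short *do_update);
	void (*update)(void *customdata);          /* e.g. send a notifier */
	const void *owner;                         /* used to avoid duplicate jobs */
	short stop, do_update, running;
	pthread_t thread;
} Job;

static void *job_thread(void *arg)
{
	Job *job = arg;

	/* the start callback does the actual work in its own thread */
	job->startjob(job->customdata, &job->stop, &job->do_update);
	job->running = 0;
	return NULL;
}

static void job_start(Job *job)
{
	job->stop = 0;
	job->do_update = 0;
	job->running = 1;
	pthread_create(&job->thread, NULL, job_thread, job);
}

/* the "timer step": check for necessary updates, or close the job when ready */
static void job_timer_step(Job *job)
{
	if (job->do_update) {
		job->do_update = 0;
		job->update(job->customdata);
	}
	if (!job->running) {
		pthread_join(job->thread, NULL);
		job->free_customdata(job->customdata);
		job->customdata = NULL;
	}
}

The owner pointer is what the manager would key on to re-use or restart an already running job for the same owner instead of stacking a second one.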
Put Node shader previews back using wmJobs
- Preview calculation is now fully threaded (still 1 thread).
- Thanks to the new event system + notifiers, you can see
previews update even while dragging sliders!
- Currently it only starts when you change a node setting.
Warning: the threaded render shares Node data, so don't delete
nodes while it renders! Making this safe is on the todo list.
Also:
- a bug in region initialization (do_versions) showed the channel
list in the node editor incorrectly.
- flagged the channel list as 'hidden' for now, it was really in
the way! This is something to work on later anyway.
- recoded the Render API callbacks so they get handlers passed in,
no globals to use anymore, remember?
- the previewrender code now gets so much nicer! Will remove a lot
of stuff from the code soon.
2009-01-22 14:59:49 +00:00
ar->regiondata = NULL;
2008-11-11 18:31:32 +00:00
}
ARegion *ui_tooltip_create(bContext *C, ARegion *butregion, uiBut *but)
{
2009-04-10 16:30:28 +00:00
uiStyle *style = U.uistyles.first; // XXX pass on as arg
2008-12-08 15:02:57 +00:00
static ARegionType type;
2008-11-11 18:31:32 +00:00
ARegion *ar;
uiTooltipData *data;
2009-04-15 14:43:54 +00:00
float fonth, fontw, aspect = but->block->aspect;
float x1f, x2f, y1f, y2f;
2008-12-26 13:11:04 +00:00
int x1, x2, y1, y2, winx, winy, ofsx, ofsy;
2008-11-11 18:31:32 +00:00
if (!but->tip || strlen(but->tip) == 0)
return NULL;
/* create area region */
2008-12-18 02:56:48 +00:00
ar = ui_add_temporary_region(CTX_wm_screen(C));
2008-11-11 18:31:32 +00:00
2008-12-08 15:02:57 +00:00
memset(&type, 0, sizeof(ARegionType));
2008-11-11 18:31:32 +00:00
type.draw = ui_tooltip_region_draw;
type.free = ui_tooltip_region_free;
ar->type = &type;
/* create tooltip data */
data = MEM_callocN(sizeof(uiTooltipData), "uiTooltipData");
data->tip = BLI_strdup(but->tip);
2009-04-10 16:30:28 +00:00
/* set font, get bb */
2009-04-15 14:43:54 +00:00
data->fstyle = style->widget; /* copy struct */
data->fstyle.align = UI_STYLE_TEXT_CENTER;
ui_fontscale(&data->fstyle.points, aspect);
uiStyleFontSet(&data->fstyle);
fontw = aspect * BLF_width(data->tip);
fonth = aspect * BLF_height(data->tip);
2008-11-11 18:31:32 +00:00
ar->regiondata = data;
/* compute position */
2008-12-26 13:11:04 +00:00
ofsx = (but->block->panel) ? but->block->panel->ofsx : 0;
ofsy = (but->block->panel) ? but->block->panel->ofsy : 0;
2009-04-15 14:43:54 +00:00
x1f = (but->x1 + but->x2) / 2.0f + ofsx - 16.0f * aspect;
x2f = x1f + fontw + 16.0f * aspect;
y2f = but->y1 + ofsy - 15.0f * aspect;
y1f = y2f - fonth - 10.0f * aspect;
/* copy to int, gets projected if possible too */
x1 = x1f; y1 = y1f; x2 = x2f; y2 = y2f;
2008-11-11 18:31:32 +00:00
if (butregion) {
2008-12-03 17:11:50 +00:00
/* XXX temp, region v2ds can be empty still */
if (butregion->v2d.cur.xmin != butregion->v2d.cur.xmax) {
2009-04-15 14:43:54 +00:00
UI_view2d_to_region_no_clip(&butregion->v2d, x1f, y1f, &x1, &y1);
UI_view2d_to_region_no_clip(&butregion->v2d, x2f, y2f, &x2, &y2);
2008-12-03 17:11:50 +00:00
}
2008-11-11 18:31:32 +00:00
x1 += butregion->winrct.xmin;
x2 += butregion->winrct.xmin;
y1 += butregion->winrct.ymin;
y2 += butregion->winrct.ymin;
}
2008-12-18 02:56:48 +00:00
wm_window_get_size(CTX_wm_window(C), &winx, &winy);
2008-11-11 18:31:32 +00:00
if (x2 > winx) {
2008-12-03 17:11:50 +00:00
/* super size */
if (x2 > winx + x1) {
x2 = winx;
x1 = 0;
}
else {
x1 -= x2 - winx;
x2 = winx;
}
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, the color chooser, tooltips etc all create screen-level regions. Each
such menu block gives a handle to the button that creates it; that handle will
contain the results of the menu block once a MESSAGE event is received from it.
* For this type of menu block the coordinates used to be in window space. They
are still created that way, and ui_positionblock still works with window
coordinates, but afterwards the block and its buttons are brought back to
region coordinates, since they are now contained in a region (see the sketch
below).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
should have enough information with these screen-level regions to have full
control over what gets drawn when, and to then do correct compositing.
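The coordinate conversion referred to above could look roughly like this (a hypothetical helper, for illustration only; the region's winrct gives its origin in window space):

/* Hypothetical helper: bring a block and its buttons from window space into
   the local space of the region 'ar' that now contains them. */
static void demo_block_to_region_coords(ARegion *ar, uiBlock *block)
{
	uiBut *bt;

	for (bt = block->buttons.first; bt; bt = bt->next) {
		bt->x1 -= ar->winrct.xmin; bt->x2 -= ar->winrct.xmin;
		bt->y1 -= ar->winrct.ymin; bt->y2 -= ar->winrct.ymin;
	}
	block->minx -= ar->winrct.xmin; block->maxx -= ar->winrct.xmin;
	block->miny -= ar->winrct.ymin; block->maxy -= ar->winrct.ymin;
}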
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
}
if (y1 < 0) {
2009-06-06 13:35:04 +00:00
	y1 += 56 * aspect;
	y2 += 56 * aspect;
2008-11-11 18:31:32 +00:00
}
2009-04-15 14:43:54 +00:00
/* widget rect, in region coords */
	data->bbox.xmin = MENU_SHADOW_SIDE;
	data->bbox.xmax = x2 - x1 + MENU_SHADOW_SIDE;
	data->bbox.ymin = MENU_SHADOW_BOTTOM;
	data->bbox.ymax = y2 - y1 + MENU_SHADOW_BOTTOM;
2008-12-03 17:11:50 +00:00
2009-04-15 14:43:54 +00:00
/* region bigger for shadow */
	ar->winrct.xmin = x1 - MENU_SHADOW_SIDE;
	ar->winrct.xmax = x2 + MENU_SHADOW_SIDE;
	ar->winrct.ymin = y1 - MENU_SHADOW_BOTTOM;
	ar->winrct.ymax = y2 + MENU_TOP;
2008-11-11 18:31:32 +00:00
2008-12-10 13:56:54 +00:00
/* adds subwindow */
	ED_region_init(C, ar);
2008-11-11 18:31:32 +00:00
/* notify change and redraw */
2008-12-16 12:28:00 +00:00
	ED_region_tag_redraw(ar);
2008-11-11 18:31:32 +00:00
	return ar;
}

void ui_tooltip_free(bContext *C, ARegion *ar)
{
2008-12-18 02:56:48 +00:00
	ui_remove_temporary_region(C, CTX_wm_screen(C), ar);
2008-11-11 18:31:32 +00:00
}
2009-06-02 18:10:06 +00:00
/************************* Creating Search Box **********************/
2009-06-03 18:31:37 +00:00
struct uiSearchItems {
	int maxitem, totitem, maxstrlen;
2009-06-05 16:11:18 +00:00
	int offset, offset_i;	/* offset for inserting in array */
	int more;		/* flag indicating there are more items */
2009-06-03 18:31:37 +00:00
	char **names;
	void **pointers;
2009-06-24 16:44:54 +00:00
	int *icons;
2009-06-27 01:15:31 +00:00
	AutoComplete *autocpl;
2009-06-29 11:29:52 +00:00
	void *active;
2009-06-03 18:31:37 +00:00
};
2009-06-02 18:10:06 +00:00
typedef struct uiSearchboxData {
	rcti bbox;
	uiFontStyle fstyle;
	uiSearchItems items;
2009-06-05 16:11:18 +00:00
	int active;	/* index in items array */
	int noback;	/* when menu opened with enough space for this */
2009-06-02 18:10:06 +00:00
} uiSearchboxData;

#define SEARCH_ITEMS	10
2009-06-03 18:31:37 +00:00
/* exported for use by search callbacks */
/* returns zero if nothing to add */
2009-06-24 16:44:54 +00:00
int uiSearchItemAdd(uiSearchItems *items, const char *name, void *poin, int iconid)
2009-06-03 18:31:37 +00:00
{
2009-06-27 01:15:31 +00:00
	/* hijack for autocomplete */
	if (items->autocpl) {
		autocomplete_do_name(items->autocpl, name);
		return 1;
	}
2009-06-03 18:31:37 +00:00
2009-06-29 11:29:52 +00:00
	/* hijack for finding active item */
	if (items->active) {
		if (poin == items->active)
			items->offset_i = items->totitem;
		items->totitem++;
		return 1;
	}
2009-06-05 16:11:18 +00:00
	if (items->totitem >= items->maxitem) {
		items->more = 1;
2009-06-03 18:31:37 +00:00
		return 0;
2009-06-05 16:11:18 +00:00
	}
	/* skip first items in list */
	if (items->offset_i > 0) {
		items->offset_i--;
		return 1;
	}
2009-06-03 18:31:37 +00:00
	BLI_strncpy(items->names[items->totitem], name, items->maxstrlen);
	items->pointers[items->totitem] = poin;
2009-06-24 16:44:54 +00:00
	items->icons[items->totitem] = iconid;
2009-06-03 18:31:37 +00:00
	items->totitem++;
2009-06-05 16:11:18 +00:00
	return 1;
2009-06-03 18:31:37 +00:00
}
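A hedged usage example of the callback side: the callback signature below is inferred from the call sites later in this file (but->search_func(C, but->search_arg, but->editstr, &items)), and the DemoItem list it iterates is purely hypothetical.

/* Hypothetical search callback, for illustration only. */
typedef struct DemoItem { struct DemoItem *next; char name[64]; } DemoItem;

static void demo_search_cb(bContext *C, void *arg, char *str, uiSearchItems *items)
{
	DemoItem *item = arg;	/* assume arg is the head of a list to search */

	for (; item; item = item->next) {
		if (str[0] && strstr(item->name, str) == NULL)
			continue;
		/* uiSearchItemAdd returns 0 once no more items fit in the box */
		if (uiSearchItemAdd(items, item->name, item, 0) == 0)
			break;
	}
}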
2009-06-05 16:11:18 +00:00
int uiSearchBoxhHeight(void)
{
	return SEARCH_ITEMS * MENU_BUTTON_HEIGHT + 2 * MENU_TOP;
}
2009-06-03 18:31:37 +00:00
/* ar is the search box itself */
2009-06-05 16:11:18 +00:00
static void ui_searchbox_select(bContext *C, ARegion *ar, uiBut *but, int step)
2009-06-03 18:31:37 +00:00
{
	uiSearchboxData *data = ar->regiondata;

	/* apply step */
	data->active += step;

	if (data->items.totitem == 0)
		data->active = 0;
2009-06-05 16:11:18 +00:00
	else if (data->active > data->items.totitem) {
		if (data->items.more) {
			data->items.offset++;
			data->active = data->items.totitem;
			ui_searchbox_update(C, ar, but, 0);
		}
		else
			data->active = data->items.totitem;
	}
	else if (data->active < 1) {
		if (data->items.offset) {
			data->items.offset--;
			data->active = 1;
			ui_searchbox_update(C, ar, but, 0);
		}
		else if (data->active < 0)
			data->active = 0;
	}
2009-06-03 18:31:37 +00:00
	ED_region_tag_redraw(ar);
}
static void ui_searchbox_butrect(rcti *rect, uiSearchboxData *data, int itemnr)
{
2009-06-05 16:11:18 +00:00
	int buth = (data->bbox.ymax - data->bbox.ymin - 2*MENU_TOP) / SEARCH_ITEMS;
2009-06-03 18:31:37 +00:00
	*rect = data->bbox;
	rect->xmin = data->bbox.xmin + 3.0f;
	rect->xmax = data->bbox.xmax - 3.0f;
2009-06-05 16:11:18 +00:00
	rect->ymax = data->bbox.ymax - MENU_TOP - itemnr*buth;
2009-06-03 18:31:37 +00:00
	rect->ymin = rect->ymax - buth;
}
2009-06-11 17:21:27 +00:00
/* x and y in screencoords */
int ui_searchbox_inside(ARegion *ar, int x, int y)
{
	uiSearchboxData *data = ar->regiondata;

	return (BLI_in_rcti(&data->bbox, x - ar->winrct.xmin, y - ar->winrct.ymin));
}
2009-06-03 18:31:37 +00:00
/* string validated to be of correct length (but->hardmax) */
void ui_searchbox_apply(uiBut *but, ARegion *ar)
{
	uiSearchboxData *data = ar->regiondata;

	but->func_arg2 = NULL;

	if (data->active) {
		char *name = data->items.names[data->active - 1];
		char *cpoin = strchr(name, '|');

		if (cpoin) cpoin[0] = 0;
		BLI_strncpy(but->editstr, name, data->items.maxstrlen);
		if (cpoin) cpoin[0] = '|';

		but->func_arg2 = data->items.pointers[data->active - 1];
	}
}
}
2009-06-05 16:11:18 +00:00
void ui_searchbox_event(bContext *C, ARegion *ar, uiBut *but, wmEvent *event)
2009-06-03 18:31:37 +00:00
{
	uiSearchboxData *data = ar->regiondata;

	switch (event->type) {
2009-06-06 13:35:04 +00:00
		case WHEELUPMOUSE:
2009-06-03 18:31:37 +00:00
		case UPARROWKEY:
2009-06-05 16:11:18 +00:00
			ui_searchbox_select(C, ar, but, -1);
2009-06-03 18:31:37 +00:00
			break;
2009-06-06 13:35:04 +00:00
		case WHEELDOWNMOUSE:
2009-06-03 18:31:37 +00:00
		case DOWNARROWKEY:
2009-06-05 16:11:18 +00:00
			ui_searchbox_select(C, ar, but, 1);
2009-06-03 18:31:37 +00:00
			break;
		case MOUSEMOVE:
			if (BLI_in_rcti(&ar->winrct, event->x, event->y)) {
				rcti rect;
				int a;

				for (a = 0; a < data->items.totitem; a++) {
					ui_searchbox_butrect(&rect, data, a);
					if (BLI_in_rcti(&rect, event->x - ar->winrct.xmin, event->y - ar->winrct.ymin)) {
						if (data->active != a + 1) {
							data->active = a + 1;
2009-06-05 16:11:18 +00:00
							ui_searchbox_select(C, ar, but, 0);
2009-06-03 18:31:37 +00:00
							break;
						}
					}
				}
			}
			break;
}
}
2009-06-02 18:10:06 +00:00
/* ar is the search box itself */
2009-06-05 16:11:18 +00:00
void ui_searchbox_update(bContext *C, ARegion *ar, uiBut *but, int reset)
2009-06-02 18:10:06 +00:00
{
	uiSearchboxData *data = ar->regiondata;
2009-06-05 16:11:18 +00:00
	/* reset vars */
2009-06-02 18:10:06 +00:00
	data->items.totitem = 0;
2009-06-05 16:11:18 +00:00
	data->items.more = 0;
2009-06-29 11:29:52 +00:00
	if (reset == 0) {
2009-06-05 16:11:18 +00:00
		data->items.offset_i = data->items.offset;
2009-06-29 11:29:52 +00:00
	}
2009-06-05 16:11:18 +00:00
	else {
		data->items.offset_i = data->items.offset = 0;
		data->active = 0;
2009-06-29 11:29:52 +00:00
		/* handle active */
		if (but->search_func && but->func_arg2) {
			data->items.active = but->func_arg2;
			but->search_func(C, but->search_arg, but->editstr, &data->items);
			data->items.active = NULL;

			/* found active item, calculate real offset by centering it */
			if (data->items.totitem) {
				/* first case, begin of list */
				if (data->items.offset_i < data->items.maxitem) {
					data->active = data->items.offset_i + 1;
					data->items.offset_i = 0;
				}
				else {
					/* second case, end of list */
					if (data->items.totitem - data->items.offset_i <= data->items.maxitem) {
						data->active = 1 + data->items.offset_i - data->items.totitem + data->items.maxitem;
						data->items.offset_i = data->items.totitem - data->items.maxitem;
					}
					else {
						/* center active item */
						data->items.offset_i -= data->items.maxitem/2;
						data->active = 1 + data->items.maxitem/2;
					}
				}
			}
			data->items.offset = data->items.offset_i;
			data->items.totitem = 0;
		}
2009-06-05 16:11:18 +00:00
	}

	/* callback */
2009-06-03 18:31:37 +00:00
	if (but->search_func)
		but->search_func(C, but->search_arg, but->editstr, &data->items);
2009-06-29 11:29:52 +00:00
/* handle case where editstr is equal to one of items */
	if (reset && data->active == 0) {
2009-06-05 16:11:18 +00:00
		int a;
2009-06-29 11:29:52 +00:00
2009-06-05 16:11:18 +00:00
		for (a = 0; a < data->items.totitem; a++) {
			char *cpoin = strchr(data->items.names[a], '|');

			if (cpoin) cpoin[0] = 0;
			if (0 == strcmp(but->editstr, data->items.names[a]))
				data->active = a + 1;
			if (cpoin) cpoin[0] = '|';
		}
2009-06-10 11:43:21 +00:00
		if (data->items.totitem == 1)
			data->active = 1;
2009-06-05 16:11:18 +00:00
}
2009-06-03 18:31:37 +00:00
/* validate selected item */
2009-06-05 16:11:18 +00:00
	ui_searchbox_select(C, ar, but, 0);
2009-06-02 18:10:06 +00:00
	ED_region_tag_redraw(ar);
}
2009-06-27 01:15:31 +00:00
void ui_searchbox_autocomplete(bContext *C, ARegion *ar, uiBut *but, char *str)
{
	uiSearchboxData *data = ar->regiondata;

	data->items.autocpl = autocomplete_begin(str, ui_get_but_string_max_length(but));

	but->search_func(C, but->search_arg, but->editstr, &data->items);

	autocomplete_end(data->items.autocpl, str);
	data->items.autocpl = NULL;
}
2009-06-02 18:10:06 +00:00
static void ui_searchbox_region_draw(const bContext *C, ARegion *ar)
{
	uiSearchboxData *data = ar->regiondata;
2009-06-03 18:31:37 +00:00
	/* pixel space */
	wmOrtho2(-0.01f, ar->winx - 0.01f, -0.01f, ar->winy - 0.01f);
2009-06-05 16:11:18 +00:00
	if (!data->noback)
		ui_draw_search_back(U.uistyles.first, NULL, &data->bbox);
2009-06-02 18:10:06 +00:00
	/* draw text */
	if (data->items.totitem) {
		rcti rect;
2009-06-03 18:31:37 +00:00
		int a;
2009-06-02 18:10:06 +00:00
		/* draw items */
		for (a = 0; a < data->items.totitem; a++) {
2009-06-03 18:31:37 +00:00
			ui_searchbox_butrect(&rect, data, a);
2009-06-24 16:44:54 +00:00
			/* widget itself */
			ui_draw_menu_item(&data->fstyle, &rect, data->items.names[a], data->items.icons[a], (a+1) == data->active ? UI_ACTIVE : 0);
2009-06-05 16:11:18 +00:00
		}
		/* indicate more */
		if (data->items.more) {
2009-06-29 11:29:52 +00:00
			ui_searchbox_butrect(&rect, data, data->items.maxitem - 1);
2009-06-05 16:11:18 +00:00
			glEnable(GL_BLEND);
2009-06-29 11:29:52 +00:00
			UI_icon_draw((rect.xmax - rect.xmin)/2, rect.ymin - 9, ICON_TRIA_DOWN);
2009-06-05 16:11:18 +00:00
			glDisable(GL_BLEND);
		}
		if (data->items.offset) {
2009-06-29 11:29:52 +00:00
			ui_searchbox_butrect(&rect, data, 0);
2009-06-05 16:11:18 +00:00
			glEnable(GL_BLEND);
2009-06-29 11:29:52 +00:00
			UI_icon_draw((rect.xmax - rect.xmin)/2, rect.ymax - 7, ICON_TRIA_UP);
2009-06-05 16:11:18 +00:00
			glDisable(GL_BLEND);
2009-06-02 18:10:06 +00:00
}
}
}
static void ui_searchbox_region_free(ARegion *ar)
{
	uiSearchboxData *data = ar->regiondata;
	int a;

	/* free search data */
	for (a = 0; a < SEARCH_ITEMS; a++)
		MEM_freeN(data->items.names[a]);
	MEM_freeN(data->items.names);
	MEM_freeN(data->items.pointers);
2009-06-24 16:44:54 +00:00
	MEM_freeN(data->items.icons);
2009-06-02 18:10:06 +00:00
	MEM_freeN(data);
	ar->regiondata = NULL;
}
ARegion *ui_searchbox_create(bContext *C, ARegion *butregion, uiBut *but)
{
	uiStyle *style = U.uistyles.first;	// XXX pass on as arg
	static ARegionType type;
	ARegion *ar;
	uiSearchboxData *data;
	float aspect = but->block->aspect;
	float x1f, x2f, y1f, y2f;
2009-06-27 01:15:31 +00:00
	int x1, x2, y1, y2, winx, winy, ofsx, ofsy;
2009-06-02 18:10:06 +00:00
	/* create area region */
	ar = ui_add_temporary_region(CTX_wm_screen(C));

	memset(&type, 0, sizeof(ARegionType));
	type.draw = ui_searchbox_region_draw;
	type.free = ui_searchbox_region_free;
	ar->type = &type;

	/* create searchbox data */
	data = MEM_callocN(sizeof(uiSearchboxData), "uiSearchboxData");

	/* set font, get bb */
	data->fstyle = style->widget;	/* copy struct */
	data->fstyle.align = UI_STYLE_TEXT_CENTER;
	ui_fontscale(&data->fstyle.points, aspect);
	uiStyleFontSet(&data->fstyle);

	ar->regiondata = data;
2009-06-05 16:11:18 +00:00
	/* special case, hardcoded feature, not draw backdrop when called from menus,
	   assume for design that popup already added it */
	if (but->block->flag & UI_BLOCK_LOOP)
		data->noback = 1;
2009-06-02 18:10:06 +00:00
/* compute position */
2009-06-10 11:43:21 +00:00
	if (but->block->flag & UI_BLOCK_LOOP) {
2009-06-11 17:21:27 +00:00
		/* this case is search menu inside other menu */
		/* we copy region size */
		ar->winrct = butregion->winrct;

		/* widget rect, in region coords */
		data->bbox.xmin = MENU_SHADOW_SIDE;
		data->bbox.xmax = (ar->winrct.xmax - ar->winrct.xmin) - MENU_SHADOW_SIDE;
		data->bbox.ymin = MENU_SHADOW_BOTTOM;
		data->bbox.ymax = (ar->winrct.ymax - ar->winrct.ymin) - MENU_SHADOW_BOTTOM;
2009-06-10 11:43:21 +00:00
		/* check if button is lower half */
		if (but->y2 < (but->block->minx + but->block->maxx)/2) {
2009-06-11 17:21:27 +00:00
			data->bbox.ymin += (but->y2 - but->y1);
2009-06-10 11:43:21 +00:00
		}
		else {
2009-06-11 17:21:27 +00:00
			data->bbox.ymax -= (but->y2 - but->y1);
2009-06-10 11:43:21 +00:00
		}
	}
	else {
2009-06-11 17:21:27 +00:00
		x1f = but->x1 - 5;	/* align text with button */
		x2f = but->x2 + 5;	/* symmetrical */
2009-06-10 11:43:21 +00:00
		y2f = but->y1;
		y1f = y2f - uiSearchBoxhHeight();
2009-06-27 01:15:31 +00:00
		ofsx = (but->block->panel) ? but->block->panel->ofsx : 0;
		ofsy = (but->block->panel) ? but->block->panel->ofsy : 0;

		x1f += ofsx;
		x2f += ofsx;
		y1f += ofsy;
		y2f += ofsy;
2009-06-02 18:10:06 +00:00
2009-06-11 17:21:27 +00:00
		/* minimal width */
		if (x2f - x1f < 150) x2f = x1f + 150;	// XXX arbitrary

		/* copy to int, gets projected if possible too */
		x1 = x1f; y1 = y1f; x2 = x2f; y2 = y2f;

		if (butregion) {
			if (butregion->v2d.cur.xmin != butregion->v2d.cur.xmax) {
				UI_view2d_to_region_no_clip(&butregion->v2d, x1f, y1f, &x1, &y1);
				UI_view2d_to_region_no_clip(&butregion->v2d, x2f, y2f, &x2, &y2);
			}

			x1 += butregion->winrct.xmin;
			x2 += butregion->winrct.xmin;
			y1 += butregion->winrct.ymin;
			y2 += butregion->winrct.ymin;
2009-06-02 18:10:06 +00:00
}
2009-06-11 17:21:27 +00:00
		wm_window_get_size(CTX_wm_window(C), &winx, &winy);

		if (x2 > winx) {
			/* super size */
			if (x2 > winx + x1) {
				x2 = winx;
				x1 = 0;
			}
			else {
				x1 -= x2 - winx;
				x2 = winx;
			}
2009-06-02 18:10:06 +00:00
}
2009-06-11 17:21:27 +00:00
		if (y1 < 0) {
			y1 += 36;
			y2 += 36;
2009-06-02 18:10:06 +00:00
}
2009-06-11 17:21:27 +00:00
		/* widget rect, in region coords */
		data->bbox.xmin = MENU_SHADOW_SIDE;
		data->bbox.xmax = x2 - x1 + MENU_SHADOW_SIDE;
		data->bbox.ymin = MENU_SHADOW_BOTTOM;
		data->bbox.ymax = y2 - y1 + MENU_SHADOW_BOTTOM;

		/* region bigger for shadow */
		ar->winrct.xmin = x1 - MENU_SHADOW_SIDE;
		ar->winrct.xmax = x2 + MENU_SHADOW_SIDE;
		ar->winrct.ymin = y1 - MENU_SHADOW_BOTTOM;
		ar->winrct.ymax = y2;
2009-06-02 18:10:06 +00:00
}
/* adds subwindow */
	ED_region_init(C, ar);

	/* notify change and redraw */
	ED_region_tag_redraw(ar);

	/* prepare search data */
	data->items.maxitem = SEARCH_ITEMS;
	data->items.maxstrlen = but->hardmax;
	data->items.totitem = 0;
	data->items.names = MEM_callocN(SEARCH_ITEMS*sizeof(void *), "search names");
	data->items.pointers = MEM_callocN(SEARCH_ITEMS*sizeof(void *), "search pointers");
2009-06-24 16:44:54 +00:00
	data->items.icons = MEM_callocN(SEARCH_ITEMS*sizeof(int), "search icons");
2009-06-02 18:10:06 +00:00
	for (x1 = 0; x1 < SEARCH_ITEMS; x1++)
		data->items.names[x1] = MEM_callocN(but->hardmax + 1, "search pointers");

	return ar;
}
void ui_searchbox_free(bContext *C, ARegion *ar)
{
	ui_remove_temporary_region(C, CTX_wm_screen(C), ar);
}
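Putting the functions above together, the button handling code is expected to drive the search box roughly in this order (a sketch of the call sequence only; the real handler keeps the region around and calls update/event repeatedly while the button is being edited):

/* Sketch of the expected call order, not the actual handler code. */
static void demo_searchbox_roundtrip(bContext *C, ARegion *butregion, uiBut *but, wmEvent *event)
{
	ARegion *searchbox = ui_searchbox_create(C, butregion, but);

	ui_searchbox_update(C, searchbox, but, 1);	/* reset, fill items for the current editstr */
	ui_searchbox_event(C, searchbox, but, event);	/* arrows/wheel/mouse move change the active item */
	ui_searchbox_apply(but, searchbox);		/* copy the chosen name into but->editstr */
	ui_searchbox_free(C, searchbox);		/* remove the temporary region again */
}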
2008-11-11 18:31:32 +00:00
/************************* Creating Menu Blocks **********************/
/* position block relative to but, result is in window space */
static void ui_block_position(wmWindow *window, ARegion *butregion, uiBut *but, uiBlock *block)
{
	uiBut *bt;
	uiSafetyRct *saferct;
	rctf butrct;
	float aspect;
	int xsize, ysize, xof = 0, yof = 0, center;
	short dir1 = 0, dir2 = 0;

	/* transform to window coordinates, using the source button region/block */
	butrct.xmin = but->x1; butrct.xmax = but->x2;
	butrct.ymin = but->y1; butrct.ymax = but->y2;

	ui_block_to_window_fl(butregion, but->block, &butrct.xmin, &butrct.ymin);
	ui_block_to_window_fl(butregion, but->block, &butrct.xmax, &butrct.ymax);

	/* calc block rect */
	if (block->minx == 0.0f && block->maxx == 0.0f) {
		if (block->buttons.first) {
			block->minx = block->miny = 10000;
			block->maxx = block->maxy = -10000;

			bt = block->buttons.first;
			while (bt) {
				if (bt->x1 < block->minx) block->minx = bt->x1;
				if (bt->y1 < block->miny) block->miny = bt->y1;
				if (bt->x2 > block->maxx) block->maxx = bt->x2;
				if (bt->y2 > block->maxy) block->maxy = bt->y2;

				bt = bt->next;
			}
		}
		else {
			/* we're nice and allow empty blocks too */
			block->minx = block->miny = 0;
			block->maxx = block->maxy = 20;
		}
	}

	aspect = (float)(block->maxx - block->minx + 4);
	ui_block_to_window_fl(butregion, but->block, &block->minx, &block->miny);
	ui_block_to_window_fl(butregion, but->block, &block->maxx, &block->maxy);

	//block->minx-= 2.0; block->miny-= 2.0;
	//block->maxx+= 2.0; block->maxy+= 2.0;

	xsize = block->maxx - block->minx + 4;	// 4 for shadow
	ysize = block->maxy - block->miny + 4;
	aspect /= (float)xsize;
	if (but) {
		int left = 0, right = 0, top = 0, down = 0;
		int winx, winy;

		wm_window_get_size(window, &winx, &winy);

		if (block->direction & UI_CENTER) center = ysize/2;
		else center = 0;

		if (butrct.xmin - xsize > 0.0) left = 1;
		if (butrct.xmax + xsize < winx) right = 1;
		if (butrct.ymin - ysize + center > 0.0) down = 1;
		if (butrct.ymax + ysize - center < winy) top = 1;

		dir1 = block->direction & UI_DIRECTION;

		/* secondary directions */
		if (dir1 & (UI_TOP | UI_DOWN)) {
			if (dir1 & UI_LEFT) dir2 = UI_LEFT;
			else if (dir1 & UI_RIGHT) dir2 = UI_RIGHT;
			dir1 &= (UI_TOP | UI_DOWN);
		}

		if (dir2 == 0) if (dir1 == UI_LEFT || dir1 == UI_RIGHT) dir2 = UI_DOWN;
		if (dir2 == 0) if (dir1 == UI_TOP || dir1 == UI_DOWN) dir2 = UI_LEFT;

		/* no space at all? don't change */
		if (left || right) {
			if (dir1 == UI_LEFT && left == 0) dir1 = UI_RIGHT;
			if (dir1 == UI_RIGHT && right == 0) dir1 = UI_LEFT;
			/* this is aligning, not append! */
			if (dir2 == UI_LEFT && right == 0) dir2 = UI_RIGHT;
			if (dir2 == UI_RIGHT && left == 0) dir2 = UI_LEFT;
		}
		if (down || top) {
			if (dir1 == UI_TOP && top == 0) dir1 = UI_DOWN;
			if (dir1 == UI_DOWN && down == 0) dir1 = UI_TOP;
			if (dir2 == UI_TOP && top == 0) dir2 = UI_DOWN;
			if (dir2 == UI_DOWN && down == 0) dir2 = UI_TOP;
		}

		if (dir1 == UI_LEFT) {
			xof = butrct.xmin - block->maxx;
			if (dir2 == UI_TOP) yof = butrct.ymin - block->miny - center;
			else yof = butrct.ymax - block->maxy + center;
		}
		else if (dir1 == UI_RIGHT) {
			xof = butrct.xmax - block->minx;
			if (dir2 == UI_TOP) yof = butrct.ymin - block->miny - center;
			else yof = butrct.ymax - block->maxy + center;
		}
		else if (dir1 == UI_TOP) {
			yof = butrct.ymax - block->miny;
			if (dir2 == UI_RIGHT) xof = butrct.xmax - block->maxx;
			else xof = butrct.xmin - block->minx;
			// changed direction?
			if ((dir1 & block->direction) == 0) {
				if (block->direction & UI_SHIFT_FLIPPED)
					xof += dir2 == UI_LEFT ? 25 : -25;
				uiBlockFlipOrder(block);
			}
		}
		else if (dir1 == UI_DOWN) {
			yof = butrct.ymin - block->maxy;
			if (dir2 == UI_RIGHT) xof = butrct.xmax - block->maxx;
			else xof = butrct.xmin - block->minx;
			// changed direction?
			if ((dir1 & block->direction) == 0) {
				if (block->direction & UI_SHIFT_FLIPPED)
					xof += dir2 == UI_LEFT ? 25 : -25;
				uiBlockFlipOrder(block);
			}
		}

		/* and now we handle the exception; no space below or to top */
		if (top == 0 && down == 0) {
			if (dir1 == UI_LEFT || dir1 == UI_RIGHT) {
				// align with bottom of screen
				yof = ysize;
			}
		}

		/* or no space left or right */
		if (left == 0 && right == 0) {
			if (dir1 == UI_TOP || dir1 == UI_DOWN) {
				// align with left side of screen
				xof = -block->minx + 5;
			}
		}

		// apply requested offset in the block
		xof += block->xofs/block->aspect;
		yof += block->yofs/block->aspect;
	}
	/* apply */
	for (bt = block->buttons.first; bt; bt = bt->next) {
		ui_block_to_window_fl(butregion, but->block, &bt->x1, &bt->y1);
		ui_block_to_window_fl(butregion, but->block, &bt->x2, &bt->y2);

		bt->x1 += xof;
		bt->x2 += xof;
		bt->y1 += yof;
		bt->y2 += yof;

		bt->aspect = 1.0;
		// ui_check_but recalculates drawstring size in pixels
		ui_check_but(bt);
	}

	block->minx += xof;
	block->miny += yof;
	block->maxx += xof;
	block->maxy += yof;

	/* safety calculus */
	if (but) {
		float midx = (butrct.xmin + butrct.xmax)/2.0;
		float midy = (butrct.ymin + butrct.ymax)/2.0;

		/* when you are outside parent button, safety there should be smaller */

		// parent button to left
		if (midx < block->minx) block->safety.xmin = block->minx - 3;
		else block->safety.xmin = block->minx - 40;
		// parent button to right
		if (midx > block->maxx) block->safety.xmax = block->maxx + 3;
		else block->safety.xmax = block->maxx + 40;

		// parent button on bottom
		if (midy < block->miny) block->safety.ymin = block->miny - 3;
		else block->safety.ymin = block->miny - 40;
		// parent button on top
		if (midy > block->maxy) block->safety.ymax = block->maxy + 3;
		else block->safety.ymax = block->maxy + 40;

		// exception for switched pulldowns...
		if (dir1 && (dir1 & block->direction) == 0) {
			if (dir2 == UI_RIGHT) block->safety.xmax = block->maxx + 3;
			if (dir2 == UI_LEFT) block->safety.xmin = block->minx - 3;
		}
		block->direction = dir1;
	}
	else {
		block->safety.xmin = block->minx - 40;
		block->safety.ymin = block->miny - 40;
		block->safety.xmax = block->maxx + 40;
		block->safety.ymax = block->maxy + 40;
	}

	/* keep a list of these, needed for pulldown menus */
	saferct = MEM_callocN(sizeof(uiSafetyRct), "uiSafetyRct");
	saferct->parent = butrct;
	saferct->safety = block->safety;
	BLI_freelistN(&block->saferct);
	if (but)
		BLI_duplicatelist(&block->saferct, &but->block->saferct);
	BLI_addhead(&block->saferct, saferct);
}
static void ui_block_region_draw(const bContext *C, ARegion *ar)
{
	uiBlock *block;
2008-12-15 19:19:39 +00:00
	for (block = ar->uiblocks.first; block; block = block->next)
2008-12-26 13:11:04 +00:00
		uiDrawBlock(C, block);
2008-11-11 18:31:32 +00:00
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too. This is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions
(a short self-contained sketch of this pattern follows this note).
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended the uiMenu functions, there is now also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the uiDefMenu
functions as they are still used in sequencer/view3d in some places (will fix).
* Also tried to simplify and improve bounds computation for popup
menus. It used to find out in advance what the size of the menu
was, but this is difficult with keymap strings in there; now
uiPopupBoundsBlock can figure this out afterwards and ensure the
popup is within the window bounds. Will convert some
other functions to use this too.
2009-01-30 12:18:08 +00:00
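To make the uiDefMenuBut / uiMenuCreateFunc flow above concrete, here is a minimal
self-contained sketch of the same callback pattern. It deliberately uses stand-in
names (MenuBuilder, menu_item, open_menu) rather than the real declarations from
UI_interface.h, since those exact signatures are not reproduced here; the point is
only that the header registers a create function and the UI code calls it when the
menu opens.

#include <stdio.h>

/* Stand-in types for illustration only; the real menu API is declared in
 * UI_interface.h and has different types and signatures. */
typedef struct MenuBuilder { int nitems; } MenuBuilder;
typedef void (*MenuCreateFunc)(MenuBuilder *menu, void *arg);

static void menu_item(MenuBuilder *menu, const char *name)
{
	/* in the real code this would add a button or operator entry to the block */
	printf("item %d: %s\n", menu->nitems++, name);
}

/* the callback a header would register: it only describes the menu contents */
static void view_menu(MenuBuilder *menu, void *arg)
{
	(void)arg;
	menu_item(menu, "Zoom In");
	menu_item(menu, "Zoom Out");
	menu_item(menu, "View All");
}

/* stand-in for "the menu button was clicked": call the stored create function */
static void open_menu(MenuCreateFunc create_func, void *arg)
{
	MenuBuilder menu = {0};
	create_func(&menu, arg);
}

int main(void)
{
	open_menu(view_menu, NULL);
	return 0;
}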
uiPopupBlockHandle *ui_popup_block_create(bContext *C, ARegion *butregion, uiBut *but, uiBlockCreateFunc create_func, uiBlockHandleCreateFunc handle_create_func, void *arg)
{
	wmWindow *window = CTX_wm_window(C);
	static ARegionType type;
	ARegion *ar;
	uiBlock *block;
	uiBut *bt;
	uiPopupBlockHandle *handle;
	uiSafetyRct *saferct;

	/* create handle */
	handle = MEM_callocN(sizeof(uiPopupBlockHandle), "uiPopupBlockHandle");
	/* store context for operator */
	handle->ctx_area = CTX_wm_area(C);
	handle->ctx_region = CTX_wm_region(C);
/* create area region */
	ar = ui_add_temporary_region(CTX_wm_screen(C));
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, which makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates but still get nested
correctly inside other layouts.
* The API was changed to allow better nesting. Previously items were added
to the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now creating things
like rows or columns always returns a layout which you add the
items into. All py scripts were updated to follow this (a minimal
sketch of this pattern follows this note).
* Computing the layout now goes in two passes: first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending on whether they are in a
column or a row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned; set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
2009-05-15 11:19:59 +00:00
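A minimal sketch of the two layout ideas above, with invented stand-in structs
instead of the real uiLayout API: creating a column returns the sub-layout that
items are added into, and sizing runs as an estimate pass followed by a positioning
pass. Names like Item, add_column, estimate and position are illustration only
(allocations are not freed; this is just a sketch).

#include <stdio.h>
#include <stdlib.h>

typedef struct Item {
	struct Item *next;
	struct Item *children;   /* non-NULL means this item is a nested column */
	int height;              /* estimated in pass 1 */
	int y;                   /* assigned in pass 2 */
} Item;

static Item *add_item(Item *parent)
{
	Item *item = calloc(1, sizeof(Item));
	item->next = parent->children;
	parent->children = item;
	return item;
}

/* a nested column is just an item that holds children; callers keep adding
 * into the returned sub-layout, matching the "returns a layout" rule above */
static Item *add_column(Item *parent) { return add_item(parent); }

/* pass 1: bottom-up size estimate */
static int estimate(Item *item)
{
	if (!item->children) return item->height = 20;   /* a single button */
	item->height = 0;
	for (Item *child = item->children; child; child = child->next)
		item->height += estimate(child);
	return item->height;
}

/* pass 2: top-down placement using the estimated sizes */
static void position(Item *item, int y)
{
	item->y = y;
	for (Item *child = item->children; child; child = child->next) {
		position(child, y);
		y += child->height;
	}
}

int main(void)
{
	Item root = {0};
	Item *col = add_column(&root);
	add_item(col);
	add_item(col);
	add_item(&root);

	estimate(&root);
	position(&root, 0);
	printf("total height: %d\n", root.height);
	return 0;
}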
	handle->region = ar;
	memset(&type, 0, sizeof(ARegionType));
	type.draw = ui_block_region_draw;
	ar->type = &type;
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler, which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up
(a small sketch of this handler shape follows this note).
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and a remove callback, and like operator modal
handlers it will remember the area and region if it is registered at the
window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem
to arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
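The handler shape described above can be sketched roughly as follows. The structs
and functions are stand-ins invented for illustration, not the actual
windowmanager handler code: the essential points are the pair of callbacks (event
handling plus remove) and the guarantee that tearing down the region's handler
list runs every remove callback.

#include <stdio.h>
#include <stdlib.h>

typedef struct Handler {
	struct Handler *next;
	int  (*handle)(void *userdata, int event);   /* returns 1 if the event was consumed */
	void (*remove)(void *userdata);              /* cleanup when the region goes away */
	void *userdata;
} Handler;

static void add_handler(Handler **list,
                        int (*handle)(void *, int),
                        void (*remove)(void *), void *userdata)
{
	Handler *h = calloc(1, sizeof(Handler));
	h->handle = handle;
	h->remove = remove;
	h->userdata = userdata;
	h->next = *list;
	*list = h;
}

static void handle_event(Handler *list, int event)
{
	for (Handler *h = list; h; h = h->next)
		if (h->handle(h->userdata, event))
			break;   /* a blocking (modal) handler stops propagation */
}

static void free_handlers(Handler **list)
{
	while (*list) {
		Handler *h = *list;
		*list = h->next;
		if (h->remove) h->remove(h->userdata);   /* guaranteed cleanup */
		free(h);
	}
}

static int ui_handle(void *userdata, int event) { (void)userdata; printf("ui event %d\n", event); return 1; }
static void ui_remove(void *userdata) { (void)userdata; printf("ui cleanup\n"); }

int main(void)
{
	Handler *region_handlers = NULL;
	add_handler(&region_handlers, ui_handle, ui_remove, NULL);
	handle_event(region_handlers, 42);
	free_handlers(&region_handlers);
	return 0;
}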
	UI_add_region_handlers(&ar->handlers);
/* create ui block */
	if (create_func)
		block = create_func(C, handle->region, arg);
	else
		block = handle_create_func(C, handle, arg);
	if (block->handle) {
		memcpy(block->handle, handle, sizeof(uiPopupBlockHandle));
		MEM_freeN(handle);
		handle = block->handle;
	}
	else
		block->handle = handle;

	ar->regiondata = handle;
2009-01-28 23:29:27 +00:00
if (!block->endblock)
	uiEndBlock(C, block);
/* if this is being created from a button */
if (but) {
2009-05-21 15:34:09 +00:00
	if (ELEM(but->type, BLOCK, PULLDOWN))
		block->xofs = -2;  /* for proper alignment */

	/* only used for automatic toolbox, so can set the shift flag */
	if (but->flag & UI_MAKE_TOP) {
		block->direction = UI_TOP | UI_SHIFT_FLIPPED;
		uiBlockFlipOrder(block);
	}
	if (but->flag & UI_MAKE_DOWN) block->direction = UI_DOWN | UI_SHIFT_FLIPPED;
	if (but->flag & UI_MAKE_LEFT) block->direction |= UI_LEFT;
	if (but->flag & UI_MAKE_RIGHT) block->direction |= UI_RIGHT;
2008-12-18 02:56:48 +00:00
	ui_block_position(window, butregion, but, block);
}
else {
	/* keep a list of these, needed for pulldown menus */
	saferct = MEM_callocN(sizeof(uiSafetyRct), "uiSafetyRct");
	saferct->safety = block->safety;
	BLI_addhead(&block->saferct, saferct);
2009-02-09 02:54:40 +00:00
	block->flag |= UI_BLOCK_POPUP;
}
/* the block and buttons were positioned in window space as in 2.4x, now
 * these menu blocks are regions so we bring it back to region space.
 * additionally we add some padding for the menu shadow or rounded menus */
ar->winrct.xmin = block->minx - MENU_SHADOW_SIDE;
ar->winrct.xmax = block->maxx + MENU_SHADOW_SIDE;
ar->winrct.ymin = block->miny - MENU_SHADOW_BOTTOM;
ar->winrct.ymax = block->maxy + MENU_TOP;
block->minx -= ar->winrct.xmin;
block->maxx -= ar->winrct.xmin;
block->miny -= ar->winrct.ymin;
block->maxy -= ar->winrct.ymin;

for (bt = block->buttons.first; bt; bt = bt->next) {
	bt->x1 -= ar->winrct.xmin;
	bt->x2 -= ar->winrct.xmin;
	bt->y1 -= ar->winrct.ymin;
	bt->y2 -= ar->winrct.ymin;
}

block->flag |= UI_BLOCK_LOOP | UI_BLOCK_MOVEMOUSE_QUIT;
2008-12-10 13:56:54 +00:00
/* adds subwindow */
ED_region_init(C, ar);
2008-12-15 19:19:39 +00:00
/* get winmat now that we actually have the subwindow */
wmSubWindowSet(window, ar->swinid);

wm_subwindow_getmatrix(window, ar->swinid, block->winmat);
2008-12-10 13:56:54 +00:00
/* notify change and redraw */
2008-12-16 12:28:00 +00:00
ED_region_tag_redraw(ar);
return handle;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the uiDefMenu
functions as they are still used in sequencer/view3d in some places (will fix).
* Also tried to simplify and improve bounds computation for popup
menus. Previously it tried to find out the size of the menu in advance,
but this is difficult with keymap strings in there; now uiPopupBoundsBlock
can figure this out afterwards and ensure the popup is within the window
bounds (see the sketch below). Will convert some other functions to use this too.
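The "measure afterwards" idea can be pictured with a small standalone sketch; rcti_demo and demo_popup_clamp are hypothetical names, and this only covers the clamping half of what uiPopupBoundsBlock would do, under the assumption that the block has already been measured from its buttons.

/* Standalone sketch of fitting an already-measured popup block inside the
 * window; hypothetical names, not the actual uiPopupBoundsBlock code. */
#include <stdio.h>

typedef struct rcti_demo {
	int xmin, xmax, ymin, ymax;
} rcti_demo;

/* shift the block so it stays inside [0, winx] x [0, winy], preferring the
 * left/bottom edges if it is larger than the window */
static void demo_popup_clamp(rcti_demo *block, int winx, int winy)
{
	int ofsx = 0, ofsy = 0;

	if (block->xmax > winx) ofsx = winx - block->xmax;
	if (block->xmin + ofsx < 0) ofsx = -block->xmin;
	if (block->ymax > winy) ofsy = winy - block->ymax;
	if (block->ymin + ofsy < 0) ofsy = -block->ymin;

	block->xmin += ofsx; block->xmax += ofsx;
	block->ymin += ofsy; block->ymax += ofsy;
}

int main(void)
{
	/* a popup measured after its buttons were created, partly outside an
	 * 800x600 window */
	rcti_demo block = {700, 950, -40, 200};

	demo_popup_clamp(&block, 800, 600);
	printf("popup: x %d..%d, y %d..%d\n", block.xmin, block.xmax, block.ymin, block.ymax);
	return 0;
}

Doing this after the buttons exist avoids having to predict the width of keymap strings and icons up front.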
2009-01-30 12:18:08 +00:00
void ui_popup_block_free(bContext *C, uiPopupBlockHandle *handle)
{
2009-06-03 18:31:37 +00:00
	/* XXX ton added, crash on load file with popup open... need to investigate */
	if (CTX_wm_screen(C))
		ui_remove_temporary_region(C, CTX_wm_screen(C), handle->region);
	MEM_freeN(handle);
}
/***************************** Menu Button ***************************/
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
conformation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was but this is difficult with keymap strings in
there, now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_MENU(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
{
	uiBut *but = arg_but;
	uiBlock *block;
	uiBut *bt;
	MenuData *md;
	ListBase lb;
	float aspect;
	int width, height, boxh, columns, rows, startx, starty, x1, y1, xmax, a;

	/* create the block */
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here are the really bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for wrong call
to ED_undo_push()
- The extern Py api .h was not included in bpy_interface.c, showing
several calls using different args.
Further just added the missing includes, and removed unused vars.
2009-04-14 15:59:52 +00:00
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP);
2008-11-11 18:31:32 +00:00
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;

	/* compute menu data */
	md = decompose_menu_string(but->str);

	/* columns and row calculation: roughly 20 items per column; if that would
	 * need more than 8 columns, use roughly 25 per column instead.
	 * e.g. 45 items: columns = (45+20)/20 = 3, rows = 45/3 = 15 */
	columns = (md->nitems+20)/20;
	if (columns < 1)
		columns = 1;

	if (columns > 8)
		columns = (md->nitems+25)/25;

	rows = md->nitems/columns;
	if (rows < 1)
		rows = 1;

	while (rows*columns < md->nitems)
		rows++;

	/* prevent scaling up of pupmenu */
2009-06-10 11:43:21 +00:00
	aspect = but->block->aspect;
2008-11-11 18:31:32 +00:00
	if (aspect < 1.0f)
		aspect = 1.0f;

	/* size and location */
	if (md->title)
2009-04-10 16:30:28 +00:00
		width = 1.5*aspect*strlen(md->title) + UI_GetStringWidth(md->title);
2008-11-11 18:31:32 +00:00
	else
		width = 0;

	for (a = 0; a < md->nitems; a++) {
2009-04-10 16:30:28 +00:00
		xmax = aspect*UI_GetStringWidth(md->items[a].str);
2008-11-11 18:31:32 +00:00
		if (md->items[a].icon)
			xmax += 20*aspect;

		if (xmax > width)
			width = xmax;
	}

	width += 10;
	if (width < (but->x2 - but->x1))
		width = (but->x2 - but->x1);
	if (width < 50)
		width = 50;
2009-06-10 11:43:21 +00:00
2008-11-11 18:31:32 +00:00
	boxh = MENU_BUTTON_HEIGHT;

	height = rows*boxh;
	if (md->title)
		height += boxh;

	/* here we go! */
	startx = but->x1;
	starty = but->y1;

	if (md->title) {
		uiBut *bt;
2009-04-10 14:06:24 +00:00
2008-11-11 18:31:32 +00:00
		if (md->titleicon) {
			bt = uiDefIconTextBut(block, LABEL, 0, md->titleicon, md->title, startx, (short)(starty+rows*boxh), (short)width, (short)boxh, NULL, 0.0, 0.0, 0, 0, "");
		}
		else {
			bt = uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty+rows*boxh), (short)width, (short)boxh, NULL, 0.0, 0.0, 0, 0, "");
			bt->flag = UI_TEXT_LEFT;
		}
	}

	for (a = 0; a < md->nitems; a++) {
		x1 = startx + width*((int)(md->nitems-a-1)/rows);
		y1 = starty - boxh*(rows - ((md->nitems-a-1)%rows)) + (rows*boxh);

		if (strcmp(md->items[md->nitems-a-1].str, "%l") == 0) {
			bt = uiDefBut(block, SEPR, B_NOP, "", x1, y1, (short)(width-(rows > 1)), (short)(boxh-1), NULL, 0.0, 0.0, 0, 0, "");
		}
		else if (md->items[md->nitems-a-1].icon) {
			bt = uiDefIconTextButF(block, BUTM|FLO, B_NOP, md->items[md->nitems-a-1].icon, md->items[md->nitems-a-1].str, x1, y1, (short)(width-(rows > 1)), (short)(boxh-1), &handle->retvalue, (float)md->items[md->nitems-a-1].retval, 0.0, 0, 0, "");
		}
		else {
			bt = uiDefButF(block, BUTM|FLO, B_NOP, md->items[md->nitems-a-1].str, x1, y1, (short)(width-(rows > 1)), (short)(boxh-1), &handle->retvalue, (float)md->items[md->nitems-a-1].retval, 0.0, 0, 0, "");
		}
	}

	menudata_free(md);

	/* the code up here has flipped locations, because of change of preferred order */
	/* that's why we have to switch list order too, to make arrow keys work */
	lb.first = lb.last = NULL;

	bt = block->buttons.first;
	while (bt) {
		uiBut *next = bt->next;
		BLI_remlink(&block->buttons, bt);
		BLI_addhead(&lb, bt);
		bt = next;
	}

	block->buttons = lb;
	block->direction = UI_TOP;
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now takes a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and remove callback, and like operator modal handlers
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
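A hedged sketch of how an editor would hook up these region level UI handlers, assuming UI_add_region_handlers() takes the region's handler ListBase as described above; the surrounding init function and comments are illustrative only:
/* assumed includes: DNA_screen_types.h (ARegion), DNA_windowmanager_types.h
 * (wmWindowManager), UI_interface.h (UI_add_region_handlers) */
static void example_region_init(wmWindowManager *wm, ARegion *ar)
{
	/* region level UI handlers replace the old UI keymap ... */
	UI_add_region_handlers(&ar->handlers);

	/* ... and the editor's own keymap handlers would still be added to
	 * ar->handlers after this, e.g. via WM_event_add_keymap_handler(). */
}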
2008-12-10 04:36:33 +00:00
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
	return block;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there are also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies menu code further here. Did not remove the uiDefMenu
functions as they are used in sequencer/view3d in some places
now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
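Since the exact uiDefMenuBut and uiMenu* signatures are not shown here, the following is a self-contained, hedged sketch of the underlying pattern only: a header button stores a creation callback, and that callback is run to fill in the items when the menu actually opens. None of the names below are Blender API.
#include <stdio.h>

typedef struct MenuBuilder { int nitems; const char *items[32]; } MenuBuilder;
typedef void (*MenuCreateFunc)(MenuBuilder *mb);   /* plays the role of uiMenuCreateFunc */

static void menu_item(MenuBuilder *mb, const char *name)
{
	if (mb->nitems < 32)
		mb->items[mb->nitems++] = name;
}

/* the callback a header would register for its "View" pulldown */
static void view_menu(MenuBuilder *mb)
{
	menu_item(mb, "Zoom In");
	menu_item(mb, "Zoom Out");
	menu_item(mb, "-");        /* separator */
	menu_item(mb, "View All");
}

int main(void)
{
	MenuBuilder mb = {0};
	MenuCreateFunc create = view_menu;   /* what the button definition would store */
	int i;

	create(&mb);                         /* deferred until the popup is opened */
	for (i = 0; i < mb.nitems; i++)
		printf("%s\n", mb.items[i]);
	return 0;
}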
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_ICONROW(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
2008-11-11 18:31:32 +00:00
{
	uiBut *but = arg_but;
	uiBlock *block;
	int a;
2009-04-14 15:59:52 +00:00
	block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP);
2008-11-11 18:31:32 +00:00
	block->flag = UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;
UI:
* Added support for soft/hard range in the buttons code. Currently
it works by only allowing dragging or click-incrementing within the soft
range, but typing a number value allows going outside it.
If the typed number is outside the soft range, the soft range will be
extended, rounded to values like:
.., 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0, ..
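A small, hedged sketch of that 1-2-5 rounding, assuming it is applied to the positive extent of the typed value; this is illustrative and not necessarily the exact rounding used by the button code:
#include <math.h>

/* round a positive value up to the nearest .., 0.1, 0.2, 0.5, 1, 2, 5, 10, .. step,
 * e.g. 3.2 -> 5.0 and 37.0 -> 50.0 */
static double soft_range_round_up(double value)
{
	double newmax;

	if (value <= 0.0)
		return value;  /* only meaningful for positive extents */

	newmax = pow(10.0, ceil(log10(value)));  /* next power of ten at or above value */

	if (newmax*0.2 >= value)
		return newmax*0.2;
	else if (newmax*0.5 >= value)
		return newmax*0.5;
	else
		return newmax;
}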
2009-03-29 18:44:49 +00:00
	for (a = (int)but->hardmin; a <= (int)but->hardmax; a++) {
		uiDefIconButF(block, BUTM|FLO, B_NOP, but->icon+(a-but->hardmin), 0, (short)(18*a), (short)(but->x2-but->x1-4), 18, &handle->retvalue, (float)a, 0.0, 0, 0, "");
2008-11-11 18:31:32 +00:00
	}

	block->direction= UI_TOP;
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler, which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up
(a rough sketch of such a handler follows after this log entry).
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and a remove callback, and like operator modal handlers
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
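Roughly, the UI handler described above can be pictured as an event callback
paired with a remove callback, stored in the region's (or window's) handler
list. All names in this sketch are hypothetical and do not claim to match the
windowmanager API.

/* hypothetical shape of a UI handler: an event callback plus a remove
   callback that runs when the region's handlers are freed */
typedef int  (*uiHandleEventFunc)(struct bContext *C, struct wmEvent *event, void *userdata);
typedef void (*uiHandleRemoveFunc)(struct bContext *C, void *userdata);

typedef struct uiHandler {
	struct uiHandler *next, *prev;
	uiHandleEventFunc handle;    /* receives mouse/key events for the region */
	uiHandleRemoveFunc remove;   /* cleanup: stop timers, close tooltips, free menus */
	void *userdata;              /* e.g. the active button data */
} uiHandler;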
2008-12-10 04:36:33 +00:00
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
	return block;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too. This is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions
(see the sketch after this log entry).
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there are also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which simplifies
the menu code further here. Did not remove the uiDefMenu functions as
they are still used in sequencer/view3d in some places (will fix).
* Also tried to simplify and improve bounds computation for popup
menus. Previously it tried to find out in advance what the size
of the menu was, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
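As a rough example of the uiDefMenuBut/uiMenuCreateFunc combination mentioned
above; the item-function names and signatures below are guesses based on the
suffixes listed in this entry, shown only to illustrate the flow.

/* a menu create callback: called when the header pulldown opens */
static void image_viewmenu(bContext *C, uiMenuItem *head, void *arg_unused)
{
	uiMenuItemO(head, 0, "IMAGE_OT_view_zoom_in");        /* operator item */
	uiMenuSeparator(head);
	uiMenuItemBooleanR(head, &sima_ptr, "draw_repeated"); /* RNA boolean, hypothetical property */
}

/* the header then only needs a single button that points at the callback,
   roughly: uiDefMenuBut(block, image_viewmenu, NULL, "View", xco, 0, 85, 19, ""); */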
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_ICONTEXTROW(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
2008-11-11 18:31:32 +00:00
{
	uiBut *but= arg_but;
	uiBlock *block;
	MenuData *md;
	int width, xmax, ypos, a;
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here are the really bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for a wrong call
to ED_undo_push()
- the extern Py api .h was not included in bpy_interface.c, showing
several calls using different args.
Further just added the missing includes, and removed unused vars.
2009-04-14 15:59:52 +00:00
	block= uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP);
2008-11-11 18:31:32 +00:00
	block->flag= UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT;

	md= decompose_menu_string(but->str);

	/* size and location */
	/* expand menu width to fit labels */
	if(md->title)
2009-04-10 16:30:28 +00:00
		width= 2*strlen(md->title) + UI_GetStringWidth(md->title);
2008-11-11 18:31:32 +00:00
	else
		width= 0;

	for(a=0; a<md->nitems; a++) {
2009-04-10 16:30:28 +00:00
		xmax= UI_GetStringWidth(md->items[a].str);
2008-11-11 18:31:32 +00:00
		if(xmax>width) width= xmax;
	}

	width+= 30;
	if(width<50) width= 50;

	ypos= 1;

	/* loop through the menu options and draw them out with icons & text labels */
	for(a=0; a<md->nitems; a++) {
		/* add a space if there's a separator (%l) */
		if(strcmp(md->items[a].str, "%l")==0) {
			ypos+= 3;
		}
		else {
2009-03-29 18:44:49 +00:00
			uiDefIconTextButF(block, BUTM|FLO, B_NOP, (short)((but->icon)+(md->items[a].retval-but->hardmin)), md->items[a].str, 0, ypos, (short)width, 19, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
2008-11-11 18:31:32 +00:00
			ypos+= 20;
		}
	}

	if(md->title) {
		uiBut *bt;
2009-04-10 14:06:24 +00:00
2008-11-11 18:31:32 +00:00
		bt= uiDefBut(block, LABEL, 0, md->title, 0, ypos, (short)width, 19, NULL, 0.0, 0.0, 0, 0, "");
		bt->flag= UI_TEXT_LEFT;
	}

	menudata_free(md);

	block->direction= UI_TOP;
	uiBoundsBlock(block, 3);
2008-12-10 04:36:33 +00:00
	uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
	return block;
}

static void ui_warp_pointer(short x, short y)
{
	/* XXX 2.50 which function to use for this? */
#if 0
	/* OSX has very poor mousewarp support, it sends events;
	   this causes a menu being pressed immediately ... */
#ifndef __APPLE__
	warp_pointer(x, y);
#endif
#endif
}

/********************* Color Button ****************/

/* picker sizes S hsize, F full size, D spacer, B button/pallette height */
#define SPICK	110.0
#define FPICK	180.0
#define DPICK	6.0
#define BPICK	24.0

#define UI_PALETTE_TOT	16
/* note; in tot+1 the old color is stored */
static float palette[UI_PALETTE_TOT+1][3]= {
	{0.93, 0.83, 0.81}, {0.88, 0.89, 0.73}, {0.69, 0.81, 0.57}, {0.51, 0.76, 0.64},
	{0.37, 0.56, 0.61}, {0.33, 0.29, 0.55}, {0.46, 0.21, 0.51}, {0.40, 0.12, 0.18},
	{1.0, 1.0, 1.0}, {0.85, 0.85, 0.85}, {0.7, 0.7, 0.7}, {0.56, 0.56, 0.56},
	{0.42, 0.42, 0.42}, {0.28, 0.28, 0.28}, {0.14, 0.14, 0.14}, {0.0, 0.0, 0.0}
};

/* for picker, while editing hsv */
void ui_set_but_hsv(uiBut *but)
{
	float col[3];

	hsv_to_rgb(but->hsv[0], but->hsv[1], but->hsv[2], col, col+1, col+2);
	ui_set_but_vectorf(but, col);
}

static void update_picker_hex(uiBlock *block, float *rgb)
{
	uiBut *bt;
	char col[16];

	sprintf(col, "%02X%02X%02X", (unsigned int)(rgb[0]*255.0), (unsigned int)(rgb[1]*255.0), (unsigned int)(rgb[2]*255.0));

	// this updates button strings, is hackish... but button pointers are on stack of caller function
	for(bt= block->buttons.first; bt; bt= bt->next) {
		if(strcmp(bt->str, "Hex: ")==0) {
			strcpy(bt->poin, col);
			ui_check_but(bt);
			break;
		}
	}
}
2009-06-12 14:22:27 +00:00
/* also used by small picker, be careful with name checks below... */
void ui_update_block_buts_hsv(uiBlock *block, float *hsv)
{
	uiBut *bt;
	float r, g, b;
	float rgb[3];

	// this updates button strings, is hackish... but button pointers are on stack of caller function
	hsv_to_rgb(hsv[0], hsv[1], hsv[2], &r, &g, &b);
	rgb[0] = r; rgb[1] = g; rgb[2] = b;
	update_picker_hex(block, rgb);

	for (bt = block->buttons.first; bt; bt = bt->next) {
		if (ELEM(bt->type, HSVCUBE, HSVCIRCLE)) {
			VECCOPY(bt->hsv, hsv);
			ui_set_but_hsv(bt);
		}
		else if (bt->str[1] == ' ') {
			if (bt->str[0] == 'R') {
				ui_set_but_val(bt, r);
			}
			else if (bt->str[0] == 'G') {
				ui_set_but_val(bt, g);
			}
			else if (bt->str[0] == 'B') {
				ui_set_but_val(bt, b);
			}
			else if (bt->str[0] == 'H') {
				ui_set_but_val(bt, hsv[0]);
			}
			else if (bt->str[0] == 'S') {
				ui_set_but_val(bt, hsv[1]);
			}
			else if (bt->str[0] == 'V') {
				ui_set_but_val(bt, hsv[2]);
			}
		}
	}
}

static void ui_update_block_buts_hex(uiBlock *block, char *hexcol)
{
	uiBut *bt;
	float r = 0, g = 0, b = 0;
	float h, s, v;

	// this updates button strings, is hackish... but button pointers are on stack of caller function
	hex_to_rgb(hexcol, &r, &g, &b);
	rgb_to_hsv(r, g, b, &h, &s, &v);

	for (bt = block->buttons.first; bt; bt = bt->next) {
		if (bt->type == HSVCUBE) {
			bt->hsv[0] = h;
			bt->hsv[1] = s;
			bt->hsv[2] = v;
			ui_set_but_hsv(bt);
		}
		else if (bt->str[1] == ' ') {
			if (bt->str[0] == 'R') {
				ui_set_but_val(bt, r);
			}
			else if (bt->str[0] == 'G') {
				ui_set_but_val(bt, g);
			}
			else if (bt->str[0] == 'B') {
				ui_set_but_val(bt, b);
			}
			else if (bt->str[0] == 'H') {
				ui_set_but_val(bt, h);
			}
			else if (bt->str[0] == 'S') {
				ui_set_but_val(bt, s);
			}
			else if (bt->str[0] == 'V') {
				ui_set_but_val(bt, v);
			}
		}
	}
}
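
/* Hedged, self-contained illustration of the conversion direction that
 * hex_to_rgb is assumed to perform above (hex string -> float components in
 * [0, 1]); this is not the original implementation and the helper name is
 * hypothetical. */
static int demo_parse_hex_color(const char *hexcol, float col[3])
{
	unsigned int r, g, b;

	if (sscanf(hexcol, "%02x%02x%02x", &r, &g, &b) != 3)
		return 0;

	col[0] = r / 255.0f;
	col[1] = g / 255.0f;
	col[2] = b / 255.0f;
	return 1;
}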
/* bt1 is palette but, col1 is original color */
/* callback to copy from/to palette */
static void do_palette_cb(bContext *C, void *bt1, void *col1)
{
	wmWindow *win = CTX_wm_window(C);
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;
	float *col = (float *)col1;
	float *fp, hsv[3];

	fp = (float *)but1->poin;

	if (win->eventstate->ctrl) {
		VECCOPY(fp, col);
	}
	else {
		VECCOPY(col, fp);
	}

	rgb_to_hsv(col[0], col[1], col[2], hsv, hsv + 1, hsv + 2);
	ui_update_block_buts_hsv(but1->block, hsv);
	update_picker_hex(but1->block, col);

	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}

static void do_hsv_cb(bContext *C, void *bt1, void *unused)
{
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;

	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}

/* bt1 is num but, hsv1 is pointer to original color in hsv space */
/* callback to handle changes in num-buts in picker */
static void do_palette1_cb(bContext *C, void *bt1, void *hsv1)
{
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;
	float *hsv = (float *)hsv1;
	float *fp = NULL;

	if (but1->str[1] == ' ') {
		if (but1->str[0] == 'R') fp = (float *)but1->poin;
		else if (but1->str[0] == 'G') fp = ((float *)but1->poin) - 1;
		else if (but1->str[0] == 'B') fp = ((float *)but1->poin) - 2;
	}

	if (fp) {
		rgb_to_hsv(fp[0], fp[1], fp[2], hsv, hsv + 1, hsv + 2);
	}

	ui_update_block_buts_hsv(but1->block, hsv);

	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}
/* bt1 is num but, col1 is pointer to original color */
/* callback to handle changes in num-buts in picker */
static void do_palette2_cb(bContext *C, void *bt1, void *col1)
{
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;
	float *rgb = (float *)col1;
	float *fp = NULL;

	if (but1->str[1] == ' ') {
		if (but1->str[0] == 'H') fp = (float *)but1->poin;
		else if (but1->str[0] == 'S') fp = ((float *)but1->poin) - 1;
		else if (but1->str[0] == 'V') fp = ((float *)but1->poin) - 2;
	}

	if (fp) {
		hsv_to_rgb(fp[0], fp[1], fp[2], rgb, rgb + 1, rgb + 2);
	}

	ui_update_block_buts_hsv(but1->block, fp);
	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}
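The R/G/B and H/S/V slider callbacks above keep the two slider groups in sync by converting through rgb_to_hsv and hsv_to_rgb. As a self-contained illustration of that round trip, here is a minimal sketch using the standard hue/saturation/value formulas; rgb_to_hsv_demo and hsv_to_rgb_demo are illustrative names, not the Blender functions.

#include <math.h>
#include <stdio.h>

/* Minimal RGB -> HSV conversion (all channels in 0..1), standard formulas. */
static void rgb_to_hsv_demo(float r, float g, float b, float *h, float *s, float *v)
{
    float cmax = fmaxf(r, fmaxf(g, b));
    float cmin = fminf(r, fminf(g, b));
    float delta = cmax - cmin;

    *v = cmax;
    *s = (cmax > 0.0f) ? delta / cmax : 0.0f;

    if (delta == 0.0f)
        *h = 0.0f;                                      /* achromatic: hue undefined, use 0 */
    else if (cmax == r)
        *h = fmodf((g - b) / delta, 6.0f) / 6.0f;
    else if (cmax == g)
        *h = (((b - r) / delta) + 2.0f) / 6.0f;
    else
        *h = (((r - g) / delta) + 4.0f) / 6.0f;

    if (*h < 0.0f)
        *h += 1.0f;                                     /* keep hue in 0..1 */
}

/* Minimal HSV -> RGB conversion, inverse of the above. */
static void hsv_to_rgb_demo(float h, float s, float v, float *r, float *g, float *b)
{
    float i = floorf(h * 6.0f);
    float f = h * 6.0f - i;
    float p = v * (1.0f - s);
    float q = v * (1.0f - s * f);
    float t = v * (1.0f - s * (1.0f - f));

    switch (((int)i) % 6) {
        case 0: *r = v; *g = t; *b = p; break;
        case 1: *r = q; *g = v; *b = p; break;
        case 2: *r = p; *g = v; *b = t; break;
        case 3: *r = p; *g = q; *b = v; break;
        case 4: *r = t; *g = p; *b = v; break;
        default: *r = v; *g = p; *b = q; break;
    }
}

int main(void)
{
    float h, s, v, r, g, b;
    rgb_to_hsv_demo(0.2f, 0.6f, 0.4f, &h, &s, &v);   /* what editing an RGB slider triggers */
    hsv_to_rgb_demo(h, s, v, &r, &g, &b);            /* what editing an HSV slider triggers */
    printf("hsv=(%.3f %.3f %.3f) rgb=(%.3f %.3f %.3f)\n", h, s, v, r, g, b);
    return 0;
}

Because both directions are exact inverses for in-gamut colors, updating one slider group and recomputing the other leaves the color unchanged, which is what lets the picker refresh all buttons after every edit.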
static void do_palette_hex_cb(bContext *C, void *bt1, void *hexcl)
{
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;
	char *hexcol = (char *)hexcl;

	ui_update_block_buts_hex(but1->block, hexcol);
	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}
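The hex field works on a plain "RRGGBB" string: the picker fills it with sprintf("%02X%02X%02X", ...) from the float color, and the callback above pushes edits back through ui_update_block_buts_hex. Below is a standalone sketch of that round trip; hex_from_rgb and rgb_from_hex are illustrative helper names, not the Blender internals.

#include <stdio.h>
#include <string.h>

/* Format a float RGB color (0..1 per channel) as an "RRGGBB" hex triplet. */
static void hex_from_rgb(char hexcol[8], const float col[3])
{
    sprintf(hexcol, "%02X%02X%02X",
            (unsigned int)(col[0] * 255.0f),
            (unsigned int)(col[1] * 255.0f),
            (unsigned int)(col[2] * 255.0f));
}

/* Parse an "RRGGBB" hex triplet back into float RGB; returns 0 on bad input. */
static int rgb_from_hex(const char *hexcol, float col[3])
{
    unsigned int r, g, b;

    if (strlen(hexcol) < 6 || sscanf(hexcol, "%02x%02x%02x", &r, &g, &b) != 3)
        return 0;

    col[0] = r / 255.0f;
    col[1] = g / 255.0f;
    col[2] = b / 255.0f;
    return 1;
}

int main(void)
{
    float col[3] = {0.2f, 0.6f, 0.4f}, back[3];
    char hexcol[8];

    hex_from_rgb(hexcol, col);          /* e.g. "339966" */
    if (rgb_from_hex(hexcol, back))
        printf("%s -> %.3f %.3f %.3f\n", hexcol, back[0], back[1], back[2]);
    return 0;
}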
/* used for both 3d view and image window */
static void do_palette_sample_cb(bContext *C, void *bt1, void *col1)	/* frontbuf */
{
/* XXX 2.50 this should become an operator? */
#if 0
	uiBut *but1 = (uiBut *)bt1;
	uiBut *but;
	float tempcol[4];
	int x = 0, y = 0;
	short mval[2];
	float hsv[3];
	short capturing;
	int oldcursor;
	Window *win;
	unsigned short dev;

	oldcursor = get_cursor();
	win = winlay_get_active_window();

	while (get_mbut() & L_MOUSE) UI_wait_for_statechange();

	SetBlenderCursor(BC_EYEDROPPER_CURSOR);

	/* loop and wait for a mouse click */
	capturing = TRUE;
	while (capturing) {
		char ascii;
		short val;

		dev = extern_qread_ext(&val, &ascii);

		if (dev == INPUTCHANGE) break;
		if (get_mbut() & R_MOUSE) break;
		else if (get_mbut() & L_MOUSE) {
			uiGetMouse(mywinget(), mval);
			x = mval[0]; y = mval[1];
			capturing = FALSE;
			break;
		}
		else if (dev == ESCKEY) break;
	}

	window_set_cursor(win, oldcursor);

	if (capturing) return;
	if (x < 0 || y < 0) return;
	/* if we've got a click, use OpenGL to sample the color under the mouse pointer */
	glReadBuffer(GL_FRONT);
	glReadPixels(x, y, 1, 1, GL_RGBA, GL_FLOAT, tempcol);
	glReadBuffer(GL_BACK);

	/* and send that color back to the picker */
	rgb_to_hsv(tempcol[0], tempcol[1], tempcol[2], hsv, hsv + 1, hsv + 2);
	ui_update_block_buts_hsv(but1->block, hsv);
	update_picker_hex(but1->block, tempcol);

	for (but = but1->block->buttons.first; but; but = but->next) {
		ui_check_but(but);
		ui_draw_but(but);
	}

	but = but1->block->buttons.first;
	ui_block_flush_back(but->block);
#endif
}
/* color picker, Gimp version. mode: 'f' = floating panel, 'p' = popup */
/* col = color to read/write, hsv/old/hexcol = memory for temporary use */
void uiBlockPickerButtons(uiBlock *block, float *col, float *hsv, float *old, char *hexcol, char mode, short retval)
{
	uiBut *bt;
	float h, offs;
	int a;

	VECCOPY(old, col);	// old color stored there, for palette_cb to work

	// the cube intersection
	bt = uiDefButF(block, HSVCUBE, retval, "", 0, DPICK + BPICK, FPICK, FPICK, col, 0.0, 0.0, 2, 0, "");
	uiButSetFunc(bt, do_hsv_cb, bt, NULL);
	bt = uiDefButF(block, HSVCUBE, retval, "", 0, 0, FPICK, BPICK, col, 0.0, 0.0, 3, 0, "");
	uiButSetFunc(bt, do_hsv_cb, bt, NULL);
	// palette
	bt = uiDefButF(block, COL, retval, "", FPICK + DPICK, 0, BPICK, BPICK, old, 0.0, 0.0, -1, 0, "Old color, click to restore");
	uiButSetFunc(bt, do_palette_cb, bt, col);
	uiDefButF(block, COL, retval, "", FPICK + DPICK, BPICK + DPICK, BPICK, 60 - BPICK - DPICK, col, 0.0, 0.0, -1, 0, "Active color");

	h = (DPICK + BPICK + FPICK - 64) / (UI_PALETTE_TOT / 2.0);
	uiBlockBeginAlign(block);
	for (a = -1 + UI_PALETTE_TOT / 2; a >= 0; a--) {
		bt = uiDefButF(block, COL, retval, "", FPICK + DPICK, 65.0 + (float)a * h, BPICK / 2, h, palette[a + UI_PALETTE_TOT / 2], 0.0, 0.0, -1, 0, "Click to choose, hold CTRL to store in palette");
		uiButSetFunc(bt, do_palette_cb, bt, col);
		bt = uiDefButF(block, COL, retval, "", FPICK + DPICK + BPICK / 2, 65.0 + (float)a * h, BPICK / 2, h, palette[a], 0.0, 0.0, -1, 0, "Click to choose, hold CTRL to store in palette");
		uiButSetFunc(bt, do_palette_cb, bt, col);
	}
	uiBlockEndAlign(block);

	// buttons
	rgb_to_hsv(col[0], col[1], col[2], hsv, hsv + 1, hsv + 2);
	sprintf(hexcol, "%02X%02X%02X", (unsigned int)(col[0] * 255.0), (unsigned int)(col[1] * 255.0), (unsigned int)(col[2] * 255.0));

	offs = FPICK + 2 * DPICK + BPICK;
	/* note: made this a TOG now, with NULL pointer, because BUT now gets handled with an afterfunc */
	bt = uiDefIconTextBut(block, TOG, UI_RETURN_OK, ICON_EYEDROPPER, "Sample", offs + 55, 170, 85, 20, NULL, 0, 0, 0, 0, "Sample the color underneath the following mouse click (ESC or RMB to cancel)");
	uiButSetFunc(bt, do_palette_sample_cb, bt, col);
	uiButSetFlag(bt, UI_TEXT_LEFT);

	bt = uiDefBut(block, TEX, retval, "Hex: ", offs, 140, 140, 20, hexcol, 0, 8, 0, 0, "Hex triplet for color (#RRGGBB)");
	uiButSetFunc(bt, do_palette_hex_cb, bt, hexcol);

	uiBlockBeginAlign(block);
	bt = uiDefButF(block, NUMSLI, retval, "R ", offs, 110, 140, 20, col, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette1_cb, bt, hsv);
	bt = uiDefButF(block, NUMSLI, retval, "G ", offs, 90, 140, 20, col + 1, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette1_cb, bt, hsv);
	bt = uiDefButF(block, NUMSLI, retval, "B ", offs, 70, 140, 20, col + 2, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette1_cb, bt, hsv);

	uiBlockBeginAlign(block);
	bt = uiDefButF(block, NUMSLI, retval, "H ", offs, 40, 140, 20, hsv, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette2_cb, bt, col);
	bt = uiDefButF(block, NUMSLI, retval, "S ", offs, 20, 140, 20, hsv + 1, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette2_cb, bt, col);
	bt = uiDefButF(block, NUMSLI, retval, "V ", offs, 0, 140, 20, hsv + 2, 0.0, 1.0, 10, 3, "");
	uiButSetFunc(bt, do_palette2_cb, bt, col);
	uiBlockEndAlign(block);
}
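For reference, here is a minimal sketch of how a block-creating function might feed uiBlockPickerButtons, following the parameter comment above: col is the color being edited, while hsv, old and hexcol are scratch storage the picker writes into. The caller name, the static buffers and the chosen retval are illustrative assumptions, not code from this file.

/* Hypothetical caller: builds a popup-style picker block for some color.
 * 'block' is assumed to have been created by the usual block-creation code;
 * only the buffer handling for uiBlockPickerButtons is shown here. */
static void my_picker_block(uiBlock *block, float *edit_col)
{
	static float hsv[3];       /* scratch HSV values, shared by the H/S/V sliders */
	static float old[3];       /* copy of the original color, for "Old color" restore */
	static char hexcol[128];   /* backing string for the "Hex:" text button */

	/* 'p' = popup mode per the comment above; retval 0 since the callbacks drive updates */
	uiBlockPickerButtons(block, edit_col, hsv, old, hexcol, 'p', 0);
}

The buffers must outlive the block (the buttons keep pointers into them), which is why a persistent or static allocation is used in this sketch.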
/* bt1 is num but, hsv1 is pointer to original color in hsv space */
/* callback to handle changes */
static void do_picker_small_cb(bContext *C, void *bt1, void *hsv1)
{
	uiBut *but1 = (uiBut *)bt1;
	uiPopupBlockHandle *popup = but1->block->handle;
	float *hsv = (float *)hsv1;
	float *fp = NULL;

	fp = (float *)but1->poin;
	rgb_to_hsv(fp[0], fp[1], fp[2], hsv, hsv + 1, hsv + 2);

	ui_update_block_buts_hsv(but1->block, hsv);

	if (popup)
		popup->menuretval = UI_RETURN_UPDATE;
}
/* picker sizes S hsize, F full size, D spacer, B button/palette height */
#define SPICK1	150.0
#define DPICK1	6.0
/* only the color, an HS circle and V slider */
static void uiBlockPickerSmall(uiBlock *block, float *col, float *hsv, float *old, char *hexcol, char mode, short retval)
{
	uiBut *bt;

	VECCOPY(old, col);	// old color stored there, for palette_cb to work

	/* HS circle */
	bt = uiDefButF(block, HSVCIRCLE, retval, "", 0, 0, SPICK1, SPICK1, col, 0.0, 0.0, 0, 0, "");
	uiButSetFunc(bt, do_picker_small_cb, bt, hsv);

	/* value */
	bt = uiDefButF(block, HSVCUBE, retval, "", SPICK1 + DPICK1, 0, 14, SPICK1, col, 0.0, 0.0, 4, 0, "");
	uiButSetFunc(bt, do_picker_small_cb, bt, hsv);
}
static void picker_new_hide_reveal(uiBlock *block, short colormode)
{
	uiBut *bt;

	/* tag buttons */
	for (bt = block->buttons.first; bt; bt = bt->next) {
		if (bt->type == NUMSLI || bt->type == TEX) {
			if (bt->str[1] == 'e') {
				if (colormode == 2) bt->flag &= ~UI_HIDDEN;
				else bt->flag |= UI_HIDDEN;
			}
			else if (ELEM3(bt->str[0], 'R', 'G', 'B')) {
				if (colormode == 0) bt->flag &= ~UI_HIDDEN;
				else bt->flag |= UI_HIDDEN;
			}
			else if (ELEM3(bt->str[0], 'H', 'S', 'V')) {
				if (colormode == 1) bt->flag &= ~UI_HIDDEN;
				else bt->flag |= UI_HIDDEN;
			}
		}
	}
}
static void do_picker_new_mode_cb(bContext *C, void *bt1, void *colv)
{
	uiBut *bt = bt1;
	short colormode = ui_get_but_val(bt);

	picker_new_hide_reveal(bt->block, colormode);
}
/* an HS circle, V slider, rgb/hsv/hex sliders */
static void uiBlockPickerNew(uiBlock *block, float *col, float *hsv, float *old, char *hexcol, char mode, short retval)
{
	static short colormode = 0;	/* temp? 0=rgb, 1=hsv, 2=hex */
	uiBut *bt;
	int width;

	VECCOPY(old, col);	// old color stored there, for palette_cb to work

	/* HS circle */
	bt = uiDefButF(block, HSVCIRCLE, retval, "", 0, 0, SPICK1, SPICK1, col, 0.0, 0.0, 0, 0, "");
	uiButSetFunc(bt, do_picker_small_cb, bt, hsv);
/* value */
	bt = uiDefButF(block, HSVCUBE, retval, "", SPICK1 + DPICK1, 0, 14, SPICK1, col, 0.0, 0.0, 4, 0, "");
2009-06-12 14:22:27 +00:00
uiButSetFunc ( bt , do_picker_small_cb , bt , hsv ) ;
2009-06-24 13:44:19 +00:00
/* mode */
width = ( SPICK1 + DPICK1 + 14 ) / 3 ;
uiBlockBeginAlign ( block ) ;
bt = uiDefButS ( block , ROW , retval , " RGB " , 0 , - 30 , width , 19 , & colormode , 0.0 , 0.0 , 0 , 0 , " " ) ;
uiButSetFunc ( bt , do_picker_new_mode_cb , bt , col ) ;
bt = uiDefButS ( block , ROW , retval , " HSV " , width , - 30 , width , 19 , & colormode , 0.0 , 1.0 , 0 , 0 , " " ) ;
uiButSetFunc ( bt , do_picker_new_mode_cb , bt , hsv ) ;
bt = uiDefButS ( block , ROW , retval , " Hex " , 2 * width , - 30 , width , 19 , & colormode , 0.0 , 2.0 , 0 , 0 , " " ) ;
uiButSetFunc ( bt , do_picker_new_mode_cb , bt , hexcol ) ;
uiBlockEndAlign ( block ) ;
/* sliders or hex */
width = ( SPICK1 + DPICK1 + 14 ) ;
rgb_to_hsv ( col [ 0 ] , col [ 1 ] , col [ 2 ] , hsv , hsv + 1 , hsv + 2 ) ;
sprintf ( hexcol , " %02X%02X%02X " , ( unsigned int ) ( col [ 0 ] * 255.0 ) , ( unsigned int ) ( col [ 1 ] * 255.0 ) , ( unsigned int ) ( col [ 2 ] * 255.0 ) ) ;
uiBlockBeginAlign ( block ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " R " , 0 , - 60 , width , 19 , col , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette1_cb , bt , hsv ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " G " , 0 , - 80 , width , 19 , col + 1 , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette1_cb , bt , hsv ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " B " , 0 , - 100 , width , 19 , col + 2 , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette1_cb , bt , hsv ) ;
uiBlockEndAlign ( block ) ;
2009-06-12 14:22:27 +00:00
2009-06-24 13:44:19 +00:00
uiBlockBeginAlign ( block ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " H " , 0 , - 60 , width , 19 , hsv , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette2_cb , bt , col ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " S " , 0 , - 80 , width , 19 , hsv + 1 , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette2_cb , bt , col ) ;
bt = uiDefButF ( block , NUMSLI , 0 , " V " , 0 , - 100 , width , 19 , hsv + 2 , 0.0 , 1.0 , 10 , 3 , " " ) ;
uiButSetFunc ( bt , do_palette2_cb , bt , col ) ;
uiBlockEndAlign ( block ) ;
bt = uiDefBut ( block , TEX , 0 , " Hex: " , 0 , - 80 , width , 19 , hexcol , 0 , 8 , 0 , 0 , " Hex triplet for color (#RRGGBB) " ) ;
uiButSetFunc ( bt , do_palette_hex_cb , bt , hexcol ) ;
picker_new_hide_reveal ( block , colormode ) ;
2009-06-12 14:22:27 +00:00
}
static int ui_picker_small_wheel(const bContext *C, uiBlock *block, wmEvent *event)
{
	float add= 0.0f;

	if(event->type==WHEELUPMOUSE)
		add= 0.05f;
	else if(event->type==WHEELDOWNMOUSE)
		add= -0.05f;

	if(add!=0.0f) {
		uiBut *but;

		for(but= block->buttons.first; but; but= but->next) {
			if(but->type==HSVCUBE && but->active==NULL) {
				uiPopupBlockHandle *popup= block->handle;
				float col[3];

				ui_get_but_vectorf(but, col);

				rgb_to_hsv(col[0], col[1], col[2], but->hsv, but->hsv+1, but->hsv+2);
				but->hsv[2]= CLAMPIS(but->hsv[2]+add, 0.0f, 1.0f);
				hsv_to_rgb(but->hsv[0], but->hsv[1], but->hsv[2], col, col+1, col+2);

				ui_set_but_vectorf(but, col);
				ui_update_block_buts_hsv(block, but->hsv);
				if(popup)
					popup->menuretval= UI_RETURN_UPDATE;

				return 1;
			}
		}
	}

	return 0;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the
uiDefMenu functions as they are still used in sequencer/view3d
in some places (will fix).
* Also simplified and improved bounds computation for popup menus.
Previously the code tried to work out the menu size in advance,
which is difficult with keymap strings included; now
uiPopupBoundsBlock can figure this out after the buttons are
defined and ensure the popup stays within the window bounds
(a hedged sketch of this pattern follows below). Will convert
some other functions to use this too.
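As an illustration of that last point, a block-creating function in the style
used elsewhere in this file can define its buttons first and only ask for
bounds afterwards. This is a hedged sketch only: the exact uiPopupBoundsBlock
argument order (a margin followed by the popup origin) is assumed from the
description above, and the block name and label text are made up for the
example.

static uiBlock *example_popup_block_func(bContext *C, uiPopupBlockHandle *handle, void *arg_unused)
{
	uiBlock *block= uiBeginBlock(C, handle->region, "example_popup", UI_EMBOSSP);

	/* buttons are defined first; their final extent is not known in advance */
	uiDefBut(block, LABEL, 0, "Example item", 0, 0, 120, 19, NULL, 0, 0, 0, 0, "");

	/* bounds are computed afterwards, so keymap strings and other late
	   additions no longer have to be measured up front
	   (assumed argument order: block, margin, mx, my) */
	uiPopupBoundsBlock(block, 6, 0, 0);

	return block;
}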
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_COL(bContext *C, uiPopupBlockHandle *handle, void *arg_but)
{
	wmWindow *win= CTX_wm_window(C); // XXX temp, needs to become keymap to detect type?
	uiBut *but= arg_but;
	uiBlock *block;
	static float hsvcol[3], oldcol[3];
	static char hexcol[128];
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here are the real bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for the wrong call
to ED_undo_push()
- The extern Py api .h was not included in bpy_interface.c, showing
several calls using different args.
Further I just added the missing includes and removed unused vars.
2009-04-14 15:59:52 +00:00
	block= uiBeginBlock(C, handle->region, "colorpicker", UI_EMBOSS);
	VECCOPY(handle->retvec, but->editvec);

	if(win->eventstate->shift) {
		uiBlockPickerButtons(block, handle->retvec, hsvcol, oldcol, hexcol, 'p', 0);
		block->flag= UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_KEEP_OPEN;
		uiBoundsBlock(block, 3);
	}
	else if(win->eventstate->alt) {
		uiBlockPickerSmall(block, handle->retvec, hsvcol, oldcol, hexcol, 'p', 0);
		block->flag= UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_RET_1|UI_BLOCK_OUT_1;
		uiBoundsBlock(block, 10);
		block->block_event_func= ui_picker_small_wheel;
	}
	else {
		uiBlockPickerNew(block, handle->retvec, hsvcol, oldcol, hexcol, 'p', 0);
		block->flag= UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_KEEP_OPEN;
		uiBoundsBlock(block, 10);
		block->block_event_func= ui_picker_small_wheel;
	}
	/* and lets go */
	block->direction= UI_TOP;

	return block;
}
2009-05-20 15:20:24 +00:00
/* ******************** Color band *************** */

static int vergcband(const void *a1, const void *a2)
{
	const CBData *x1= a1, *x2= a2;

	if(x1->pos > x2->pos) return 1;
	else if(x1->pos < x2->pos) return -1;
	return 0;
}

static void colorband_pos_cb(bContext *C, void *coba_v, void *unused_v)
{
	ColorBand *coba= coba_v;
	int a;

	if(coba->tot<2) return;

	for(a=0; a<coba->tot; a++) coba->data[a].cur= a;
	qsort(coba->data, coba->tot, sizeof(CBData), vergcband);
	for(a=0; a<coba->tot; a++) {
		if(coba->data[a].cur==coba->cur) {
			/* if(coba->cur!=a) addqueue(curarea->win, REDRAW, 0); */ /* button cur */
			coba->cur= a;
			break;
		}
	}
}

static void colorband_add_cb(bContext *C, void *coba_v, void *unused_v)
{
	ColorBand *coba= coba_v;

	if(coba->tot < MAXCOLORBAND-1) coba->tot++;
	coba->cur= coba->tot-1;

	colorband_pos_cb(C, coba, NULL);
//	BIF_undo_push("Add colorband");
}

static void colorband_del_cb(bContext *C, void *coba_v, void *unused_v)
{
	ColorBand *coba= coba_v;
	int a;

	if(coba->tot<2) return;

	for(a=coba->cur; a<coba->tot; a++) {
		coba->data[a]= coba->data[a+1];
	}
	if(coba->cur) coba->cur--;
	coba->tot--;

//	BIF_undo_push("Delete colorband");
//	BIF_preview_changed(ID_TE);
}

void uiBlockColorbandButtons(uiBlock *block, ColorBand *coba, rctf *butr, int event)
{
	CBData *cbd;
	uiBut *bt;
	float unit= (butr->xmax-butr->xmin)/14.0f;
	float xs= butr->xmin;

	cbd= coba->data + coba->cur;

	uiBlockBeginAlign(block);
	uiDefButF(block, COL, event, "", xs, butr->ymin+20.0f, 2.0f*unit, 20, &(cbd->r), 0, 0, 0, 0, "");
	uiDefButF(block, NUM, event, "A:", xs+2.0f*unit, butr->ymin+20.0f, 4.0f*unit, 20, &(cbd->a), 0.0f, 1.0f, 10, 2, "");
	bt= uiDefBut(block, BUT, event, "Add", xs+6.0f*unit, butr->ymin+20.0f, 2.0f*unit, 20, NULL, 0, 0, 0, 0, "Adds a new color position to the colorband");
	uiButSetFunc(bt, colorband_add_cb, coba, NULL);
	bt= uiDefBut(block, BUT, event, "Del", xs+8.0f*unit, butr->ymin+20.0f, 2.0f*unit, 20, NULL, 0, 0, 0, 0, "Deletes the active position");
	uiButSetFunc(bt, colorband_del_cb, coba, NULL);

	uiDefButS(block, MENU, event, "Interpolation %t|Ease %x1|Cardinal %x3|Linear %x0|B-Spline %x2|Constant %x4",
		xs+10.0f*unit, butr->ymin+20.0f, unit*4.0f, 20, &coba->ipotype, 0.0, 0.0, 0, 0, "Sets interpolation type");

	uiDefBut(block, BUT_COLORBAND, event, "", xs, butr->ymin, butr->xmax-butr->xmin, 20.0f, coba, 0, 0, 0, 0, "");
	uiBlockEndAlign(block);
}
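For context, uiBlockColorbandButtons is meant to be embedded in an existing
button block by whichever editor owns the ColorBand. A hedged calling sketch,
in which the rectangle values, the B_TEXPRV event number and the texture as
owner of the ColorBand are assumptions made purely for illustration:

	rctf rect;

	/* illustrative layout rectangle for the colorband widgets */
	rect.xmin= 10.0f; rect.xmax= 310.0f;
	rect.ymin= 10.0f; rect.ymax= 60.0f;

	/* draws the color swatch, alpha slider, Add/Del buttons,
	   interpolation menu and the band itself */
	uiBlockColorbandButtons(block, tex->coba, &rect, B_TEXPRV);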
/* ******************** PUPmenu ****************** */

static int pupmenu_set= 0;
void uiPupMenuSetActive(int val)
{
	pupmenu_set= val;
}
/* value== -1 read, otherwise set */
static int pupmenu_memory(char *str, int value)
{
	static char mem[256], first= 1;
	int val= 0, nr= 0;

	if(first) {
		memset(mem, 0, 256);
		first= 0;
	}

	while(str[nr]) {
		val+= str[nr];
		nr++;
	}

	if(value >= 0) mem[val & 255]= value;
	else return mem[val & 255];

	return 0;
}
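In effect pupmenu_memory is a tiny persistent memory keyed on the menu string:
the bytes of the string are summed into an 8-bit slot, and the number of the
last chosen item is kept there so the same menu can reopen with that entry
preselected. A hedged usage sketch (the variable names are illustrative only,
not taken from the code below):

	/* read back the previously chosen entry for this menu string */
	lastselected= pupmenu_memory(info->instr, -1);

	/* ... later, once the user has activated entry 'nr' ... */
	pupmenu_memory(info->instr, nr);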
#define PUP_LABELH	6

typedef struct uiPupMenuInfo {
	char *instr;
	int mx, my;
	int startx, starty;
	int maxrow;
} uiPupMenuInfo;
uiBlock *ui_block_func_PUPMENU(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
{
	uiBlock *block;
	uiPupMenuInfo *info;
	int columns, rows, mousemove[2]= {0, 0}, mousewarp= 0;
	int width, height, xmax, ymax, maxrow;
	int a, startx, starty, endx, endy, x1, y1;
	int lastselected;
	MenuData *md;

	info= arg_info;
	maxrow= info->maxrow;
	height= 0;

	/* block stuff first, need to know the font */
	block= uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP);
	uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_RET_1|UI_BLOCK_NUMSELECT);
	block->direction= UI_DOWN;
md = decompose_menu_string(info->instr);
rows = md->nitems;
2009-06-14 12:53:47 +00:00
if (rows < 1)
	rows = 1;
2008-11-11 18:31:32 +00:00
columns = 1;

/* size and location, title slightly bigger for bold */
if (md->title) {
2009-04-10 16:30:28 +00:00
	width = 2*strlen(md->title) + UI_GetStringWidth(md->title);
2008-11-11 18:31:32 +00:00
	width /= columns;
}
else width = 0;

for (a = 0; a < md->nitems; a++) {
2009-04-10 16:30:28 +00:00
	xmax = UI_GetStringWidth(md->items[a].str);
2008-11-11 18:31:32 +00:00
	if (xmax > width) width = xmax;

	if (strcmp(md->items[a].str, "%l") == 0) height += PUP_LABELH;
	else height += MENU_BUTTON_HEIGHT;
}

width += 10;
if (width < 50) width = 50;
2008-12-18 02:56:48 +00:00
wm_window_get_size(CTX_wm_window(C), &xmax, &ymax);
2008-11-11 18:31:32 +00:00
/* set first item */
lastselected = 0;
if (pupmenu_set) {
	lastselected = pupmenu_set - 1;
	pupmenu_set = 0;
}
else if (md->nitems > 1) {
	lastselected = pupmenu_memory(info->instr, -1);
}

startx = info->mx - (0.8*(width));
starty = info->my - height + MENU_BUTTON_HEIGHT/2;

if (lastselected >= 0 && lastselected < md->nitems) {
	for (a = 0; a < md->nitems; a++) {
		if (a == lastselected) break;
		if (strcmp(md->items[a].str, "%l") == 0) starty += PUP_LABELH;
		else starty += MENU_BUTTON_HEIGHT;
	}
	//starty= info->my-height+MENU_BUTTON_HEIGHT/2+lastselected*MENU_BUTTON_HEIGHT;
}

if (startx < 10) {
	startx = 10;
}
if (starty < 10) {
	mousemove[1] = 10 - starty;
	starty = 10;
}

endx = startx + width*columns;
endy = starty + height;

if (endx > xmax) {
	endx = xmax - 10;
	startx = endx - width*columns;
}
if (endy > ymax - 20) {
	mousemove[1] = ymax - endy - 20;
	endy = ymax - 20;
	starty = endy - height;
}

if (mousemove[0] || mousemove[1]) {
	ui_warp_pointer(info->mx + mousemove[0], info->my + mousemove[1]);
	mousemove[0] = info->mx;
	mousemove[1] = info->my;
	mousewarp = 1;
}

/* here we go! */
if (md->title) {
	uiBut *bt;
	char titlestr[256];

	if (md->titleicon) {
		width += 20;
		sprintf(titlestr, " %s", md->title);
		uiDefIconTextBut(block, LABEL, 0, md->titleicon, titlestr, startx, (short)(starty+height), width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
	}
	else {
		bt = uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty+height), columns*width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
		bt->flag = UI_TEXT_LEFT;
	}
2009-01-04 00:05:40 +00:00
//uiDefBut(block, SEPR, 0, "", startx, (short)(starty+height)-MENU_SEPR_HEIGHT, width, MENU_SEPR_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
2008-11-11 18:31:32 +00:00
}

x1 = startx + width*((int)a/rows);
2009-01-04 00:05:40 +00:00
y1 = starty + height - MENU_BUTTON_HEIGHT; // - MENU_SEPR_HEIGHT;
2008-11-11 18:31:32 +00:00
for (a = 0; a < md->nitems; a++) {
	char *name = md->items[a].str;
	int icon = md->items[a].icon;

	if (strcmp(name, "%l") == 0) {
		uiDefBut(block, SEPR, B_NOP, "", x1, y1, width, PUP_LABELH, NULL, 0, 0.0, 0, 0, "");
		y1 -= PUP_LABELH;
	}
	else if (icon) {
		uiDefIconButF(block, BUTM, B_NOP, icon, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
		y1 -= MENU_BUTTON_HEIGHT;
	}
	else {
		uiDefButF(block, BUTM, B_NOP, name, x1, y1, width, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
		y1 -= MENU_BUTTON_HEIGHT;
	}
}

uiBoundsBlock(block, 1);
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler, which makes the code clearer. This UI handler is attached
to the region along with the other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up
(a small stand-alone sketch of this handler shape follows after these notes).
This should fix XXXs in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region-level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event-handling and a remove callback, and like operator modal
handlers it will remember the area and region if registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlighting doesn't work without moving the mouse first when Blender starts
up. I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't
seem to arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* This seems not to be due to this commit, but UI and keymap handlers are
leaking. Handlers are apparently being added to regions in all screens,
including regions of areas that are not visible, but these handlers are not
removed. Probably there should only be handlers in visible regions?
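As a rough illustration of the handler shape described above (an event callback
paired with a remove callback, stored in a list on the region), here is a small
self-contained C sketch. The struct and function names are invented for
illustration only and are not the actual windowmanager API.

typedef struct DemoEvent { int type; } DemoEvent;

typedef struct DemoUIHandler {
	struct DemoUIHandler *next;
	int  (*handle)(DemoEvent *event, void *userdata);  /* nonzero = event consumed */
	void (*remove)(void *userdata);                     /* cleanup: cancel tooltips, timers, ... */
	void *userdata;
} DemoUIHandler;

typedef struct DemoRegion { DemoUIHandler *handlers; } DemoRegion;

/* add a UI handler to the front of the region's handler list */
static void demo_region_add_ui_handler(DemoRegion *region, DemoUIHandler *handler)
{
	handler->next = region->handlers;
	region->handlers = handler;
}

/* when all handlers for the region are removed, each remove callback runs,
 * so state tied to this region (open tooltips, timers) can be cleaned up */
static void demo_region_remove_handlers(DemoRegion *region)
{
	DemoUIHandler *handler;

	for (handler = region->handlers; handler; handler = handler->next)
		if (handler->remove)
			handler->remove(handler->userdata);

	region->handlers = NULL;
}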
2008-12-10 04:36:33 +00:00
uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
menudata_free(md);

/* XXX 2.5 need to store last selected */
#if 0
/* calculate last selected */
if (event & ui_return_ok) {
	lastselected = 0;
	for (a = 0; a < md->nitems; a++) {
		if (val == md->items[a].retval) lastselected = a;
	}
	pupmenu_memory(info->instr, lastselected);
}
#endif

/* XXX 2.5 need to warp back */
#if 0
if (mousemove[1] && (event & ui_return_out) == 0)
	ui_warp_pointer(mousemove[0], mousemove[1]);
return val;
#endif

return block;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too; this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended the uiMenu functions; there are now also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted the image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the uiDefMenu
functions, as they are still used in some places in sequencer/view3d (will fix).
* Also tried to simplify and improve the bounds computation for
popup menus. Previously the code tried to find out in advance what
the size of the menu would be, but this is difficult with keymap
strings included; now uiPopupBoundsBlock can figure this out
afterwards and ensure the popup is within the window bounds (see the
sketch below). Will convert some other functions to use this too.
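The compute-bounds-afterwards idea can be illustrated with a small self-contained
C sketch: take the union of the rectangles of the buttons that were just created,
then shift the whole popup back inside the window if it overflows. Types and names
here are stand-ins, not the actual uiPopupBoundsBlock implementation.

typedef struct DemoRect { int xmin, ymin, xmax, ymax; } DemoRect;

/* union of all button rects, then shift so the popup stays inside the window */
static void demo_popup_bounds(const DemoRect *buts, int nbuts, int winx, int winy, DemoRect *r)
{
	int a, dx = 0, dy = 0;

	*r = buts[0];
	for (a = 1; a < nbuts; a++) {
		if (buts[a].xmin < r->xmin) r->xmin = buts[a].xmin;
		if (buts[a].ymin < r->ymin) r->ymin = buts[a].ymin;
		if (buts[a].xmax > r->xmax) r->xmax = buts[a].xmax;
		if (buts[a].ymax > r->ymax) r->ymax = buts[a].ymax;
	}

	/* shift back inside the window, keeping the lower-left corner visible */
	if (r->xmax > winx) dx = winx - r->xmax;
	if (r->xmin + dx < 0) dx = -r->xmin;
	if (r->ymax > winy) dy = winy - r->ymax;
	if (r->ymin + dy < 0) dy = -r->ymin;

	r->xmin += dx; r->xmax += dx;
	r->ymin += dy; r->ymax += dy;
}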
2009-01-30 12:18:08 +00:00
uiBlock *ui_block_func_PUPMENUCOL(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
2008-11-11 18:31:32 +00:00
{
	uiBlock *block;
	uiPupMenuInfo *info;
	int columns, rows, mousemove[2] = {0, 0}, mousewarp;
	int width, height, xmax, ymax, maxrow;
	int a, startx, starty, endx, endy, x1, y1;
	float fvalue;
	MenuData *md;

	info = arg_info;
	maxrow = info->maxrow;
	height = 0;

	/* block stuff first, need to know the font */
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now: Regular,
Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings so people at least fix
them... here are the really bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for wrong call
to ED_undo_push()
- The extern Py API .h was not included in bpy_interface.c, showing
several calls using different args.
Further just added the missing includes, and removed unused vars.
2009-04-14 15:59:52 +00:00
block = uiBeginBlock(C, handle->region, "menu", UI_EMBOSSP);
2008-11-11 18:31:32 +00:00
uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_RET_1|UI_BLOCK_NUMSELECT);
2009-01-10 14:03:00 +00:00
block->direction= UI_DOWN;
2008-11-11 18:31:32 +00:00
md= decompose_menu_string(info->instr);
/* columns and row calculation */
columns= (md->nitems+maxrow)/maxrow;
if(columns<1) columns= 1;
if(columns>8) {
maxrow += 5;
columns= (md->nitems+maxrow)/maxrow;
}
rows= (int)md->nitems/columns;
if(rows<1) rows= 1;
while(rows*columns < (md->nitems+columns)) rows++;
/* size and location, title slightly bigger for bold */
if(md->title) {
2009-04-10 16:30:28 +00:00
width= 2*strlen(md->title) + UI_GetStringWidth(md->title);
2008-11-11 18:31:32 +00:00
width /= columns;
}
else width= 0;
for(a=0; a<md->nitems; a++) {
2009-04-10 16:30:28 +00:00
xmax= UI_GetStringWidth(md->items[a].str);
2008-11-11 18:31:32 +00:00
if(xmax>width) width= xmax;
}
width += 10;
if(width<50) width= 50;
height= rows*MENU_BUTTON_HEIGHT;
if(md->title) height += MENU_BUTTON_HEIGHT;
2008-12-18 02:56:48 +00:00
wm_window_get_size(CTX_wm_window(C), &xmax, &ymax);
2008-11-11 18:31:32 +00:00
/* find active item */
fvalue= handle->retvalue;
for(a=0; a<md->nitems; a++) {
if(md->items[a].retval == (int)fvalue) break;
}
/* no active item? */
if(a==md->nitems) {
if(md->title) a= -1;
else a= 0;
}
if(a>0)
startx= info->mx - width/2 - ((int)(a)/rows)*width;
else
startx= info->mx - width/2;
starty= info->my - height + MENU_BUTTON_HEIGHT/2 + ((a)%rows)*MENU_BUTTON_HEIGHT;
if(md->title) starty += MENU_BUTTON_HEIGHT;
if(startx<10) {
mousemove[0]= 10 - startx;
startx= 10;
}
if(starty<10) {
mousemove[1]= 10 - starty;
starty= 10;
}
endx= startx + width*columns;
endy= starty + height;
if(endx>xmax) {
mousemove[0]= xmax - endx - 10;
endx= xmax - 10;
startx= endx - width*columns;
}
if(endy>ymax) {
mousemove[1]= ymax - endy - 10;
endy= ymax - 10;
starty= endy - height;
}
if(mousemove[0] || mousemove[1]) {
ui_warp_pointer(info->mx + mousemove[0], info->my + mousemove[1]);
mousemove[0]= info->mx;
mousemove[1]= info->my;
mousewarp= 1;
}
/* here we go! */
if(md->title) {
uiBut *bt;
if(md->titleicon) {
}
else {
bt= uiDefBut(block, LABEL, 0, md->title, startx, (short)(starty+rows*MENU_BUTTON_HEIGHT), columns*width, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
bt->flag= UI_TEXT_LEFT;
}
}
for(a=0; a<md->nitems; a++) {
char *name= md->items[a].str;
int icon= md->items[a].icon;
x1= startx + width*((int)a/rows);
y1= starty - MENU_BUTTON_HEIGHT*(a%rows) + (rows-1)*MENU_BUTTON_HEIGHT;
if(strcmp(name, "%l")==0) {
uiDefBut(block, SEPR, B_NOP, "", x1, y1, width, PUP_LABELH, NULL, 0, 0.0, 0, 0, "");
y1 -= PUP_LABELH;
}
else if(icon) {
uiDefIconButF(block, BUTM, B_NOP, icon, x1, y1, width+16, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
y1 -= MENU_BUTTON_HEIGHT;
}
else {
uiDefButF(block, BUTM, B_NOP, name, x1, y1, width, MENU_BUTTON_HEIGHT-1, &handle->retvalue, (float)md->items[a].retval, 0.0, 0, 0, "");
y1 -= MENU_BUTTON_HEIGHT;
}
}
uiBoundsBlock(block, 1);
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now takes a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and remove callback, and like operator modal handlers
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
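As a rough illustration of the handler setup described above, here is a sketch
of how a region could register the UI handlers and build its block with the
context-aware begin/end calls. Only the function names come from the notes;
the callback names and the argument lists beyond the context are assumptions,
not code from the branch.
/* Sketch only: wiring the region-level UI handler into a region init
 * callback. The callback name and the handler-list argument are assumed. */
static void example_header_region_init(wmWindowManager *wm, ARegion *ar)
{
	/* replaces the old UI keymap, per the notes above */
	UI_add_region_handlers(&ar->handlers);
}
/* Sketch only: block creation/teardown now carries the context, so timers
 * and tooltips can be cancelled when the region is removed. */
static void example_header_region_draw(const bContext *C, ARegion *ar)
{
	uiBlock *block= uiBeginBlock(C, ar, "example_header", UI_EMBOSS);
	/* ... uiDefBut() calls would go here ... */
	uiEndBlock(C, block);
}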
2008-12-10 04:36:33 +00:00
uiEndBlock(C, block);
2008-11-11 18:31:32 +00:00
menudata_free(md);
/* XXX 2.5 need to warp back */
#if 0
if((event & UI_RETURN_OUT)==0)
ui_warp_pointer(mousemove[0], mousemove[1]);
#endif
return block;
}
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was but this is difficult with keymap strings in
there, now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
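A short usage sketch of the renamed popup API follows; the return type and
argument lists are assumptions based on the notes above, and the item calls
are left as comments since their exact signatures are not shown here.
/* Sketch only: building a popup with the uiPupMenuBegin/End calls named
 * above. Argument lists are assumed, not taken from the branch. */
static void example_popup(bContext *C)
{
	uiPopupMenu *pup= uiPupMenuBegin(C, "Example", 0);

	/* item calls such as the BooleanO/EnumR/Separator variants listed
	 * above would be added here, e.g.
	 *   uiMenuItemBooleanO(...), uiMenuSeparator(...) */

	uiPupMenuEnd(C, pup);
}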
2009-01-30 12:18:08 +00:00
/************************** Menu Definitions ***************************/
2009-01-25 20:22:05 +00:00
/* prototype */
2009-01-30 12:18:08 +00:00
static uiBlock *ui_block_func_MENU_ITEM(bContext *C, uiPopupBlockHandle *handle, void *arg_info);
2009-01-25 20:22:05 +00:00
2009-04-22 18:39:44 +00:00
struct uiPopupMenu {
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correctly inside other layouts.
* API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now when creating things
like rows or columns it always returns a layout which you have to add
the items in. All py scripts were updated to follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending if they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
2009-05-15 11:19:59 +00:00
uiBlock *block;
2009-04-22 18:39:44 +00:00
uiLayout *layout;
2009-01-25 20:22:05 +00:00
};
typedef struct uiMenuInfo {
2009-04-22 18:39:44 +00:00
uiPopupMenu *pup;
2009-01-30 12:18:08 +00:00
int mx, my, popup, slideout;
2009-01-25 20:22:05 +00:00
int startx, starty;
} uiMenuInfo;
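To make the two-pass idea from the "UI: Layout Engine" notes above concrete,
here is a minimal, self-contained sketch: a first pass estimates the size of
a column of items, a second pass uses that estimate to place them. Types and
names are illustrative only and are not the branch's uiLayout API.
/* Minimal sketch of estimate-then-place layout; not Blender code. */
#include <stdio.h>

typedef struct Item { int w, h, x, y; } Item;

/* pass 1: estimate the total width/height needed by a column of items */
static void column_estimate(Item *items, int n, int *tot_w, int *tot_h)
{
	int i;
	*tot_w= 0; *tot_h= 0;
	for(i=0; i<n; i++) {
		if(items[i].w > *tot_w) *tot_w= items[i].w;
		*tot_h += items[i].h;
	}
}

/* pass 2: with the size known, place the items top-down from starty */
static void column_place(Item *items, int n, int x, int starty)
{
	int i, y= starty;
	for(i=0; i<n; i++) {
		y -= items[i].h;
		items[i].x= x;
		items[i].y= y;
	}
}

int main(void)
{
	Item items[3]= {{80, 20, 0, 0}, {120, 20, 0, 0}, {60, 20, 0, 0}};
	int w, h;

	column_estimate(items, 3, &w, &h);
	column_place(items, 3, 10, 200);
	printf("column is %dx%d, first item at (%d,%d)\n", w, h, items[0].x, items[0].y);
	return 0;
}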
2009-01-30 12:18:08 +00:00
/************************ Menu Definitions to uiBlocks ***********************/
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
const char *ui_menu_enumpropname(char *opname, const char *propname, int retval)
2009-01-25 20:22:05 +00:00
{
2009-01-30 12:18:08 +00:00
wmOperatorType *ot= WM_operatortype_find(opname);
PointerRNA ptr;
PropertyRNA *prop;
if(!ot || !ot->srna)
return "";
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
RNA_pointer_create(NULL, ot->srna, NULL, &ptr);
prop= RNA_struct_find_property(&ptr, propname);
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
if(prop) {
const EnumPropertyItem *item;
int totitem, i;
RNA_property_enum_items(&ptr, prop, &item, &totitem);
for(i=0; i<totitem; i++) {
if(item[i].value==retval)
return item[i].name;
}
}
2009-01-25 20:22:05 +00:00
2009-01-30 12:18:08 +00:00
return " " ;
2009-01-25 20:22:05 +00:00
}
2009-02-13 17:37:01 +00:00
typedef struct MenuItemLevel {
int opcontext;
char *opname;
char *propname;
PointerRNA rnapoin;
} MenuItemLevel;
2009-01-30 12:18:08 +00:00
static uiBlock *ui_block_func_MENU_ITEM(bContext *C, uiPopupBlockHandle *handle, void *arg_info)
2009-01-25 20:22:05 +00:00
{
2009-01-30 12:18:08 +00:00
uiBlock *block;
uiMenuInfo *info= arg_info;
2009-04-22 18:39:44 +00:00
uiPopupMenu *pup;
2009-01-30 12:18:08 +00:00
ScrArea *sa;
ARegion *ar;
2009-01-25 20:22:05 +00:00
2009-04-22 18:39:44 +00:00
pup= info->pup;
2009-05-15 11:19:59 +00:00
block= pup->block;
2009-01-30 12:18:08 +00:00
/* block stuff first, need to know the font */
2009-05-15 11:19:59 +00:00
uiBlockSetRegion(block, handle->region);
2009-01-30 12:18:08 +00:00
block->direction= UI_DOWN;
2009-01-25 20:22:05 +00:00
2009-05-15 11:19:59 +00:00
	uiBlockLayoutResolve(C, block, NULL, NULL);
2009-01-30 12:18:08 +00:00
	if (info->popup) {
		uiBlockSetFlag(block, UI_BLOCK_LOOP|UI_BLOCK_REDRAW|UI_BLOCK_NUMSELECT|UI_BLOCK_RET_1);
		uiBlockSetDirection(block, UI_DOWN);
2009-02-04 11:52:16 +00:00
/* here we set an offset for the mouse position */
2009-05-15 11:19:59 +00:00
		uiMenuPopupBoundsBlock(block, 1, 0, 1.5*MENU_BUTTON_HEIGHT);
2009-01-30 12:18:08 +00:00
	}
	else {
		/* for a header menu we set the direction automatically */
		if (!info->slideout) {
			sa = CTX_wm_area(C);
			ar = CTX_wm_region(C);

			if (sa && sa->headertype == HEADERDOWN) {
				if (ar && ar->regiontype == RGN_TYPE_HEADER) {
					uiBlockSetDirection(block, UI_TOP);
					uiBlockFlipOrder(block);
				}
			}
		}

		uiTextBoundsBlock(block, 50);
	}

	/* if menu slides out of another menu, override direction */
	if (info->slideout)
		uiBlockSetDirection(block, UI_RIGHT);
2009-01-25 20:22:05 +00:00
	uiEndBlock(C, block);

	return block;
}
2009-01-30 12:18:08 +00:00
uiPopupBlockHandle *ui_popup_menu_create(bContext *C, ARegion *butregion, uiBut *but, uiMenuCreateFunc menu_func, void *arg)
{
2009-04-27 18:05:58 +00:00
	uiStyle *style = U.uistyles.first;
2009-01-30 12:18:08 +00:00
	uiPopupBlockHandle *handle;
2009-04-22 18:39:44 +00:00
	uiPopupMenu *pup;
2009-01-30 12:18:08 +00:00
	uiMenuInfo info;
2009-04-22 18:39:44 +00:00
	pup = MEM_callocN(sizeof(uiPopupMenu), "menu dummy");
2009-05-15 11:19:59 +00:00
	pup->block = uiBeginBlock(C, NULL, "ui_popup_menu_create", UI_EMBOSSP);
	pup->layout = uiBlockLayout(pup->block, UI_LAYOUT_VERTICAL, UI_LAYOUT_MENU, 0, 0, 200, 0, style);
2009-05-28 23:37:55 +00:00
	uiLayoutSetOperatorContext(pup->layout, WM_OP_INVOKE_REGION_WIN);
2009-05-15 11:19:59 +00:00
	/* create in advance so we can let buttons point to retval already */
	pup->block->handle = MEM_callocN(sizeof(uiPopupBlockHandle), "uiPopupBlockHandle");
2009-01-30 12:18:08 +00:00
2009-04-22 18:39:44 +00:00
	menu_func(C, pup->layout, arg);
2009-01-30 12:18:08 +00:00
	memset(&info, 0, sizeof(info));
2009-04-22 18:39:44 +00:00
	info.pup = pup;
2009-01-30 12:18:08 +00:00
	info.slideout = (but && (but->block->flag & UI_BLOCK_LOOP));

	handle = ui_popup_block_create(C, butregion, but, NULL, ui_block_func_MENU_ITEM, &info);
2009-04-22 18:39:44 +00:00
	MEM_freeN(pup);
2009-01-30 12:18:08 +00:00
	return handle;
}

/*************************** Menu Creating API **************************/

/*************************** Popup Menu API **************************/

/* only return handler, and set optional title */
2009-05-15 11:19:59 +00:00
uiPopupMenu *uiPupMenuBegin(bContext *C, const char *title, int icon)
2009-01-30 12:18:08 +00:00
{
2009-04-27 18:05:58 +00:00
	uiStyle *style = U.uistyles.first;
2009-04-22 18:39:44 +00:00
	uiPopupMenu *pup = MEM_callocN(sizeof(uiPopupMenu), "menu start");
2009-05-15 11:19:59 +00:00
	uiBut *but;
2009-01-30 12:18:08 +00:00
2009-05-15 11:19:59 +00:00
	pup->block = uiBeginBlock(C, NULL, "uiPupMenuBegin", UI_EMBOSSP);
	pup->layout = uiBlockLayout(pup->block, UI_LAYOUT_VERTICAL, UI_LAYOUT_MENU, 0, 0, 200, 0, style);
2009-05-28 23:37:55 +00:00
	uiLayoutSetOperatorContext(pup->layout, WM_OP_EXEC_REGION_WIN);
2009-05-15 11:19:59 +00:00
	/* create in advance so we can let buttons point to retval already */
	pup->block->handle = MEM_callocN(sizeof(uiPopupBlockHandle), "uiPopupBlockHandle");
2009-01-30 12:18:08 +00:00
2009-05-15 11:19:59 +00:00
	/* create title button */
	if (title && title[0]) {
		char titlestr[256];

		if (icon) {
			sprintf(titlestr, " %s", title);
			uiDefIconTextBut(pup->block, LABEL, 0, icon, titlestr, 0, 0, 200, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
		}
		else {
			but = uiDefBut(pup->block, LABEL, 0, (char *)title, 0, 0, 200, MENU_BUTTON_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
			but->flag = UI_TEXT_LEFT;
		}

		//uiDefBut(block, SEPR, 0, "", startx, (short)(starty+height)-MENU_SEPR_HEIGHT, width, MENU_SEPR_HEIGHT, NULL, 0.0, 0.0, 0, 0, "");
	}
2009-04-22 18:39:44 +00:00
	return pup;
2009-01-30 12:18:08 +00:00
}
/* set the whole structure to work */
2009-04-22 18:39:44 +00:00
void uiPupMenuEnd(bContext *C, uiPopupMenu *pup)
2009-01-30 12:18:08 +00:00
{
	wmWindow *window = CTX_wm_window(C);
	uiMenuInfo info;
	uiPopupBlockHandle *menu;

	memset(&info, 0, sizeof(info));
	info.popup = 1;
	info.mx = window->eventstate->x;
	info.my = window->eventstate->y;
2009-04-22 18:39:44 +00:00
	info.pup = pup;
2009-01-30 12:18:08 +00:00
	menu = ui_popup_block_create(C, NULL, NULL, NULL, ui_block_func_MENU_ITEM, &info);
	menu->popup = 1;
2009-03-25 20:49:15 +00:00
	UI_add_popup_handlers(C, &window->handlers, menu);
2009-01-30 12:18:08 +00:00
	WM_event_add_mousemove(C);
2009-04-22 18:39:44 +00:00
	MEM_freeN(pup);
}

uiLayout *uiPupMenuLayout(uiPopupMenu *pup)
{
	return pup->layout;
2009-01-30 12:18:08 +00:00
}
2009-02-08 19:15:59 +00:00
/* ************** standard pupmenus *************** */

/* this one can be called with an operatortype name or with operators */
static uiPopupBlockHandle *ui_pup_menu(bContext *C, int maxrow, uiMenuHandleFunc func, void *arg, char *str, ...)
2009-01-30 12:18:08 +00:00
{
	wmWindow *window = CTX_wm_window(C);
	uiPupMenuInfo info;
	uiPopupBlockHandle *menu;

	memset(&info, 0, sizeof(info));
	info.mx = window->eventstate->x;
	info.my = window->eventstate->y;
	info.maxrow = maxrow;
	info.instr = str;

	menu = ui_popup_block_create(C, NULL, NULL, NULL, ui_block_func_PUPMENU, &info);
	menu->popup = 1;
2009-03-25 20:49:15 +00:00
	UI_add_popup_handlers(C, &window->handlers, menu);
2009-01-30 12:18:08 +00:00
	WM_event_add_mousemove(C);

	menu->popup_func = func;
	menu->popup_arg = arg;
2009-02-08 19:15:59 +00:00
	return menu;
2009-01-30 12:18:08 +00:00
}
2009-02-08 19:15:59 +00:00
static void operator_name_cb(bContext *C, void *arg, int retval)
2009-01-30 12:18:08 +00:00
{
	const char *opname = arg;

	if (opname && retval > 0)
		WM_operator_name_call(C, opname, WM_OP_EXEC_DEFAULT, NULL);
}
2009-02-08 19:15:59 +00:00
static void vconfirm_opname(bContext *C, char *opname, char *title, char *itemfmt, va_list ap)
2009-01-30 12:18:08 +00:00
{
	char *s, buf[512];

	s = buf;
	if (title) s += sprintf(s, "%s%%t|", title);
	vsprintf(s, itemfmt, ap);
2009-02-08 19:15:59 +00:00
	ui_pup_menu(C, 0, operator_name_cb, opname, buf);
2009-01-30 12:18:08 +00:00
}
2009-02-08 19:15:59 +00:00
static void operator_cb(bContext *C, void *arg, int retval)
2009-01-30 12:18:08 +00:00
{
2009-02-08 19:15:59 +00:00
	wmOperator *op = arg;

	if (op && retval > 0)
		WM_operator_call(C, op);
	else
		WM_operator_free(op);
}
2009-01-30 12:18:08 +00:00
2009-02-08 19:15:59 +00:00
static void confirm_cancel_operator(void *opv)
{
	WM_operator_free(opv);
2009-01-30 12:18:08 +00:00
}
2009-02-08 19:15:59 +00:00
static void confirm_operator(bContext *C, wmOperator *op, char *title, char *item)
{
	uiPopupBlockHandle *handle;
	char *s, buf[512];

	s = buf;
	if (title) s += sprintf(s, "%s%%t|%s", title, item);

	handle = ui_pup_menu(C, 0, operator_cb, op, buf);
	handle->cancel_func = confirm_cancel_operator;
}
2009-01-30 12:18:08 +00:00
void uiPupMenuOkee(bContext *C, char *opname, char *str, ...)
{
	va_list ap;
	char titlestr[256];
2009-05-23 07:19:31 +00:00
	sprintf(titlestr, "OK? %%i%d", ICON_QUESTION);
2009-01-30 12:18:08 +00:00
	va_start(ap, str);
2009-02-08 19:15:59 +00:00
	vconfirm_opname(C, opname, titlestr, str, ap);
2009-01-30 12:18:08 +00:00
	va_end(ap);
}
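For reference, a hypothetical usage sketch of the confirmation helper above; the operator idname below is made up purely for illustration:
/* Hypothetical caller: ask the user before running an operator by name. */
static void example_confirm_erase(bContext *C)
{
	/* "OBJECT_OT_delete_example" is a placeholder idname, not a real operator */
	uiPupMenuOkee(C, "OBJECT_OT_delete_example", "Erase selected object(s)?");
}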
2009-02-08 19:15:59 +00:00
void uiPupMenuSaveOver(bContext *C, wmOperator *op, char *filename)
2009-01-30 12:18:08 +00:00
{
	size_t len = strlen(filename);

	if (len == 0)
		return;

	if (filename[len-1] == '/' || filename[len-1] == '\\') {
		uiPupMenuError(C, "Cannot overwrite a directory");
2009-02-08 19:15:59 +00:00
        WM_operator_free(op);
2009-01-30 12:18:08 +00:00
        return;
    }
2009-02-10 09:49:36 +00:00
    if (BLI_exists(filename) == 0)
        operator_cb(C, op, 1);
    else
        confirm_operator(C, op, "Save over", filename);
2009-01-30 12:18:08 +00:00
}

void uiPupMenuNotice(bContext *C, char *str, ...)
{
    va_list ap;

    va_start(ap, str);
2009-02-08 19:15:59 +00:00
    vconfirm_opname(C, NULL, NULL, str, ap);
2009-01-30 12:18:08 +00:00
    va_end(ap);
}

void uiPupMenuError(bContext *C, char *str, ...)
{
    va_list ap;
    char nfmt[256];
    char titlestr[256];

    /* %%i%d puts the icon number into the title string (old menu string syntax) */
    sprintf(titlestr, "Error %%i%d", ICON_ERROR);
    sprintf(nfmt, "%s", str);

    va_start(ap, str);
2009-02-08 19:15:59 +00:00
    vconfirm_opname(C, NULL, titlestr, nfmt, ap);
2009-01-30 12:18:08 +00:00
    va_end(ap);
}
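A quick illustration of the printf-style call (hypothetical call site; this
assumes, as the code above suggests, that vconfirm_opname applies the va_list
to the format string):

/* Hypothetical call site: show a formatted error as a popup. */
if (!BLI_exists(filename))
    uiPupMenuError(C, "Cannot read file \"%s\"", filename);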
void uiPupMenuReports(bContext *C, ReportList *reports)
{
    Report *report;
    DynStr *ds;
    char *str;

    if (!reports || !reports->list.first)
        return;
    if (!CTX_wm_window(C))
        return;

    ds = BLI_dynstr_new();

    /* one menu entry per error/warning report, using the %i (icon) and
     * %t (title) codes of the old menu string syntax */
    for (report = reports->list.first; report; report = report->next) {
        if (report->type >= RPT_ERROR)
            BLI_dynstr_appendf(ds, "Error %%i%d%%t|%s", ICON_ERROR, report->message);
        else if (report->type >= RPT_WARNING)
            BLI_dynstr_appendf(ds, "Warning %%i%d%%t|%s", ICON_ERROR, report->message);
    }

    str = BLI_dynstr_get_cstring(ds);
2009-02-08 19:15:59 +00:00
    ui_pup_menu(C, 0, NULL, NULL, str);
2009-01-30 12:18:08 +00:00
    MEM_freeN(str);
    BLI_dynstr_free(ds);
}
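A hedged sketch of how an operator might surface its reports through this
function; do_the_work is a hypothetical placeholder, while BKE_report,
op->reports and the OPERATOR_* return values are the usual 2.5 conventions:

/* Sketch: collect reports during exec and pop them up on failure.
 * do_the_work() is a hypothetical helper. */
static int my_tool_exec(bContext *C, wmOperator *op)
{
    if (!do_the_work(C)) {
        BKE_report(op->reports, RPT_ERROR, "Tool failed: no active object");
        uiPupMenuReports(C, op->reports);
        return OPERATOR_CANCELLED;
    }
    return OPERATOR_FINISHED;
}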
2009-01-25 20:22:05 +00:00
2009-02-04 11:52:16 +00:00
/*************************** Popup Block API **************************/
void uiPupBlockO(bContext *C, uiBlockCreateFunc func, void *arg, char *opname, int opcontext)
{
    wmWindow *window = CTX_wm_window(C);
    uiPopupBlockHandle *handle;

    handle = ui_popup_block_create(C, NULL, NULL, func, NULL, arg);
    handle->popup = 1;
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Previously you had to store all the info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correctly inside other layouts.
* API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now creating things like
rows or columns always returns a layout that you add the items into
(see the C sketch after this list). All py scripts were updated to
follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending on whether they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
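The C sketch referenced in the list above. The signatures here
(uiLayoutColumn/uiLayoutRow returning the layout that items are added into,
the uiItemR argument order) and the RNA property names are assumptions for
illustration only, not verified against the actual API.

/* Sketch under assumed signatures: items go into the layout returned by
 * uiLayoutColumn/uiLayoutRow; align=1 joins the buttons visually. */
static void my_panel_draw(const bContext *C, Panel *pa)
{
    Scene *scene = CTX_data_scene(C);
    PointerRNA sceneptr;
    uiLayout *col, *row;

    RNA_id_pointer_create(&scene->id, &sceneptr);

    col = uiLayoutColumn(pa->layout, 1);    /* vertical, aligned */
    uiItemR(col, NULL, 0, &sceneptr, "start_frame", 0);
    uiItemR(col, NULL, 0, &sceneptr, "end_frame", 0);

    row = uiLayoutRow(pa->layout, 0);       /* horizontal, not aligned */
    uiItemR(row, NULL, 0, &sceneptr, "current_frame", 0);
}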
2009-05-15 11:19:59 +00:00
    handle->optype = (opname) ? WM_operatortype_find(opname) : NULL;
2009-02-04 11:52:16 +00:00
    handle->opcontext = opcontext;
2009-03-25 20:49:15 +00:00
    UI_add_popup_handlers(C, &window->handlers, handle);
2009-02-04 11:52:16 +00:00
    WM_event_add_mousemove(C);
}

void uiPupBlock(bContext *C, uiBlockCreateFunc func, void *arg)
{
    uiPupBlockO(C, func, arg, NULL, 0);
}
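To show how these entry points are typically driven, here is a hedged sketch
of a block-create callback passed to uiPupBlock. The uiBeginBlock, uiEndBlock
and uiDefBut calls, their argument lists and the uiPopupBoundsBlock parameters
are assumptions about the button API, not verified against the actual headers.

/* Sketch (assumed button API): build the block shown by uiPupBlock. */
static uiBlock *my_popup_create(bContext *C, ARegion *ar, void *arg)
{
    uiBlock *block = uiBeginBlock(C, ar, "my_popup_block", UI_EMBOSS);

    uiDefBut(block, LABEL, 0, "Popup contents go here",
             0, 0, 200, 20, NULL, 0.0, 0.0, 0, 0, "");

    /* let the block compute its bounds afterwards, as described above */
    uiPopupBoundsBlock(block, 6, 0, 0);
    uiEndBlock(C, block);

    return block;
}

void show_my_popup(bContext *C)
{
    /* closed automatically on click-outside or escape */
    uiPupBlock(C, my_popup_create, NULL);
}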
2009-05-22 15:02:32 +00:00
void uiPupBlockOperator(bContext *C, uiBlockCreateFunc func, wmOperator *op, int opcontext)
{
    wmWindow *window = CTX_wm_window(C);
    uiPopupBlockHandle *handle;

    handle = ui_popup_block_create(C, NULL, NULL, func, NULL, op);
    handle->popup = 1;
    handle->retvalue = 1;

    handle->popup_arg = op;
    handle->popup_func = operator_cb;
    handle->cancel_func = confirm_cancel_operator;
    handle->opcontext = opcontext;

    UI_add_popup_handlers(C, &window->handlers, handle);
    WM_event_add_mousemove(C);
}