Port of part of the Interface code to 2.50.

This is based on the current trunk version, so these files should not need
merges. There are two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48; these can be enabled again once trunk is
merged into this branch.

Further, this is not all of the interface code; many parts are still commented out:
* interface.c: nearly all button types; missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: replaces ui_do_but and uiDoBlocks with two operators,
  making the code non-blocking.
* interface_regions.c: a part of interface.c that was split off; contains code
  to create regions for tooltips, menus, pupmenu (currently crashing) and the
  color chooser; basically regions with buttons, fairly independent of the
  core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
  icons yet. Panels should probably become (free floating) regions?
* text.c (formerly language.c): for drawing text and translation. This works,
  but it still uses bad globals and could be cleaned up.

Header Files:
* ED_datafiles.h now has declarations for the datatoc_ files, so those extern
  declarations can be #included instead of being repeated.
* The user interface code is in UI_interface.h and the other UI_* files.

Core:
* The API for creating blocks, buttons, etc. is still nearly the same. Blocks
  are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
  should be possible while editing a button. That does mean we need some sort
  of persistence, even though the Blender model is to recreate the buttons for
  each redraw. So when a new block is created, some matching happens to find
  out which buttons correspond to buttons in the previously created block, and
  for activated buttons some data is then copied over to the new button (see
  the sketch after this list).
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
  in this module, instead of multiple function calls in the windowmanager.
* Removed most statics/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure how
  it would integrate here.
* Currently only full window redraws are used; this should become per region
  and maybe per button later.
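
To make the matching step concrete, here is a minimal sketch of the idea;
DemoBut, DemoBlock and the match-by-name-and-type rule are simplified stand-ins
chosen for illustration, not the actual uiBut/uiBlock code:

    /* Sketch: matching buttons between the old and the newly created block,
     * handing over the interaction state of an active button. */
    #include <stddef.h>
    #include <string.h>

    typedef struct DemoBut {
      char name[64];      /* identifier used for matching */
      int type;           /* button type must match too */
      void *active_state; /* interaction state kept while the button is active */
      struct DemoBut *next;
    } DemoBut;

    typedef struct DemoBlock {
      DemoBut *buttons; /* singly linked list of buttons */
    } DemoBlock;

    /* Find the button in the old block that corresponds to 'newbut'. */
    static DemoBut *find_matching_but(DemoBlock *oldblock, const DemoBut *newbut)
    {
      for (DemoBut *but = oldblock->buttons; but; but = but->next) {
        if (but->type == newbut->type && strcmp(but->name, newbut->name) == 0) {
          return but;
        }
      }
      return NULL;
    }

    /* Copy interaction data from the previous block, so an active button keeps
     * working across a full rebuild of its block. */
    static void block_carry_over_active(DemoBlock *oldblock, DemoBlock *newblock)
    {
      for (DemoBut *newbut = newblock->buttons; newbut; newbut = newbut->next) {
        DemoBut *oldbut = find_matching_but(oldblock, newbut);
        if (oldbut && oldbut->active_state) {
          newbut->active_state = oldbut->active_state; /* hand over, do not free */
          oldbut->active_state = NULL;
        }
      }
    }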

Operators:
* Events are currently handled through two operators: button activate and menu
  handle. Operators may not be the best way to implement this, since there are
  currently some issues with events being missed, but they can become a special
  handler type instead; that should not be a big change.
* The button activate operator runs as long as a button is active, and handles
  all interaction with that button until it is no longer activated. This covers
  clicking, text editing, number dragging, opening menu blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be
  made non-blocking as well. That means variables that were previously on the
  stack now need to be stored away in a struct, so they can be accessed again
  when the operator receives more events.
* Additionally, the position in the ui_do_but code used to imply the state; now
  the state needs to be set explicitly in order to handle the right events in
  the right state. An activated button can be in one of these states: init,
  highlight, wait_flash, wait_release, wait_key_event, num_editing,
  text_editing, text_selecting, block_open, exit.
* For each button type a ui_apply_but_* function has also been separated out
  from ui_do_but. This makes it possible to continuously apply the button while
  text is being typed, for example, and there is an option in the code to
  enable this. Since the code is non-blocking and can even deal with the button
  being deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is not
  edited directly, since that would mean data is changed without the correct
  updates happening while some other part of Blender may be accessing it in the
  meantime. So data values, strings and vectors are written to a temporary
  location and only flushed in the apply function (see the sketch after this
  list).
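
A minimal sketch of the last two points, the explicit state and the
temporary-value-plus-apply step; ButState, DemoHandle and the drag arithmetic
are illustrative stand-ins, not the real ui_do_but/ui_apply_but_* code:

    /* Sketch: explicit states plus a temporary value that is only written back
     * to the real data in the apply step. */
    #include <stdbool.h>

    typedef enum ButState {
      BUT_STATE_INIT,
      BUT_STATE_HIGHLIGHT,
      BUT_STATE_NUM_EDITING,
      BUT_STATE_TEXT_EDITING,
      BUT_STATE_EXIT,
    } ButState;

    typedef struct DemoHandle {
      ButState state; /* replaces the implicit "position in ui_do_but" */
      float *poin;    /* the real data, owned elsewhere */
      float tempval;  /* edited copy, flushed only on apply */
      bool changed;
    } DemoHandle;

    /* Apply step: the only place where the real data is written. */
    static void demo_apply_but_num(DemoHandle *h)
    {
      if (h->changed && h->poin) {
        *h->poin = h->tempval;
      }
    }

    /* Called once per incoming event; returns true while the button stays active. */
    static bool demo_do_but_num(DemoHandle *h, int drag_dx, bool press, bool release)
    {
      switch (h->state) {
        case BUT_STATE_HIGHLIGHT:
          if (press) {
            h->tempval = *h->poin; /* start editing from the current value */
            h->state = BUT_STATE_NUM_EDITING;
          }
          break;
        case BUT_STATE_NUM_EDITING:
          h->tempval += 0.1f * (float)drag_dx; /* dragging edits the copy only */
          h->changed = true;
          if (release) {
            demo_apply_but_num(h); /* flush to the real data */
            h->state = BUT_STATE_EXIT;
          }
          break;
        default:
          break;
      }
      return h->state != BUT_STATE_EXIT;
    }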

Regions:
* Menus, the color chooser, tooltips, etc. all create screen-level regions.
  Such a menu block gives a handle to the button that creates it; this handle
  will contain the results of the menu block once a MESSAGE event is received
  from that menu block.
* For this type of menu block the coordinates used to be in window space. They
  are still created that way, and ui_positionblock still works with window
  coordinates, but after that the block and its buttons are brought back to
  region coordinates, since they are now contained in a region (see the sketch
  after this list).
* The flush/overdraw frontbuffer drawing code was removed; the windowmanager
  should have enough information with these screen-level regions to have full
  control over what gets drawn when, and to then do correct compositing.
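
The conversion back to region coordinates is just an offset by the region's
window-space origin; a small sketch with simplified stand-in types (not the
real ARegion/rcti):

    /* Sketch: bringing a window-space rectangle back into region-relative
     * coordinates by subtracting the region's window-space origin. */
    typedef struct DemoRect {
      int xmin, xmax, ymin, ymax;
    } DemoRect;

    typedef struct DemoRegion {
      DemoRect winrct; /* region bounds in window space */
    } DemoRegion;

    static void demo_window_to_region(const DemoRegion *region, DemoRect *rect)
    {
      rect->xmin -= region->winrct.xmin;
      rect->xmax -= region->winrct.xmin;
      rect->ymin -= region->winrct.ymin;
      rect->ymax -= region->winrct.ymin;
    }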
Testing:
* The header in the time space currently has some buttons to test the UI code.

/*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software Foundation,
* Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*
* The Original Code is Copyright (C) 2001-2002 by NaN Holding BV.
* All rights reserved.
*/
/** \file
* \ingroup edinterface
*/
#include <float.h>
#include <limits.h>
#include <math.h>
#include <string.h>
#include <ctype.h>
#include <stddef.h> /* offsetof() */
#include "MEM_guardedalloc.h"
#include "DNA_object_types.h"
#include "DNA_scene_types.h"
#include "DNA_screen_types.h"
#include "DNA_userdef_types.h"
#include "DNA_workspace_types.h"
#include "BLI_math.h"
#include "BLI_listbase.h"
#include "BLI_string.h"
#include "BLI_string_utf8.h"
#include "BLI_rect.h"
#include "BLI_utildefines.h"
#include "BKE_animsys.h"
#include "BKE_context.h"
#include "BKE_idprop.h"
#include "BKE_main.h"
#include "BKE_scene.h"
#include "BKE_screen.h"
#include "BKE_unit.h"
#include "GPU_glew.h"
#include "GPU_matrix.h"
#include "GPU_state.h"
#include "BLF_api.h"
#include "BLT_translation.h"
#include "UI_interface.h"
#include "UI_interface_icons.h"
#include "IMB_imbuf.h"
#include "WM_api.h"
#include "WM_types.h"
#include "WM_message.h"
#include "RNA_access.h"
#include "BPY_extern.h"
#include "ED_screen.h"
#include "ED_numinput.h"
Color Management, Stage 2: Switch color pipeline to use OpenColorIO
Replace the old color pipeline, which supported only linear/sRGB color spaces,
with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
images/movies from the color space the file is saved in to Blender's linear
space (for float images; byte images are not converted internally, only the
input space is stored for such images and used later).
This setting can be found in the image/clip data block settings.
- Display color space, which defines the space a particular display works in.
This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, some additional conversions
can happen apart from converting the image to display space.
These conversions are:
- View, which defines a tone curve applied before the display transformation.
These are different ways to view the image on the same display device.
For example, it can be used to emulate a film view on an sRGB display.
- Exposure, which affects image exposure before the tone map is applied.
- Gamma, a post-display gamma correction that can be used to match a particular
display's gamma.
- RGB curves, user-defined curves applied before the display transformation,
which can be used for different purposes.
By default all these settings apply only to the render result and do not
affect other images. If a particular image needs to be affected by these
transformations, the "View as Render" setting of the image data block should
be enabled. Movie clips are always affected by all display transformations.
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used if things such as grading need to be done in a color space
different from sRGB (i.e. when a Film view on an sRGB display is used, using
the VD16 space as the sequencer's internal space makes grading work in a space
close to the one used for display).
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it was
created from 16-bit byte images.
- The space of the byte buffer is stored in the image buffer's rect_colorspace
property.
- The profile of the image buffer was removed, since it is no longer meaningful.
- OpenGL and GLSL are assumed to always work in sRGB space. It is possible to
support other spaces, but that is quite a large project and not very important.
- The legacy "Color Management disabled" option is emulated by using the None
display. This could cause some regressions, but there is no clear way to avoid
them.
- If OpenColorIO is disabled at build time, Blender should behave the same way
as the previous release with color management enabled.
More details can be found at this page (more details will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas and Lukas Toene for the initial work on OpenColorIO
integration, and to Brecht van Lommel for further development and code/usecase
review!
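As an aside, the ordering of the display settings described above (exposure
before the view/display transform, gamma after it) can be sketched in a few
lines. This is only an illustration: the function names are hypothetical, the
hard-coded sRGB encoding stands in for an arbitrary OCIO display transform, and
the exposure-in-stops and 1/gamma conventions are assumptions rather than a
description of Blender's actual implementation.

#include <math.h>

/* Hypothetical stand-in for a real OCIO display transform (scene-linear to sRGB). */
static float display_transform_srgb(float linear)
{
	return (linear <= 0.0031308f) ? linear * 12.92f : 1.055f * powf(linear, 1.0f / 2.4f) - 0.055f;
}

/* Illustration of the ordering only: exposure scales the linear value before
 * the view/display transform, gamma is applied afterwards as a correction. */
static float apply_display_settings_example(float linear, float exposure, float gamma)
{
	float v = linear * exp2f(exposure); /* pre-display exposure (assumed to be in stops) */
	v = display_transform_srgb(v);      /* view + display transform */
	return powf(v, 1.0f / gamma);       /* post-display gamma correction (convention assumed) */
}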
#include "IMB_colormanagement.h"
#include "DEG_depsgraph_query.h"
#include "interface_intern.h"
/* prototypes. */
static void ui_but_to_pixelrect(struct rcti *rect, const struct ARegion *ar, struct uiBlock *block, struct uiBut *but);
static void ui_def_but_rna__menu(bContext *UNUSED(C), uiLayout *layout, void *but_p);
/* avoid unneeded calls to ui_but_value_get */
#define UI_BUT_VALUE_UNSET DBL_MAX
#define UI_GET_BUT_VALUE_INIT(_but, _value) if (_value == DBL_MAX) { (_value) = ui_but_value_get(_but); } (void)0
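/* A minimal usage sketch (hypothetical helper, for illustration only): the
 * cached value starts out as UI_BUT_VALUE_UNSET and ui_but_value_get() runs
 * at most once, and only on the path that actually reads the value. */
static bool example_but_value_is_positive(uiBut *but)
{
	double value = UI_BUT_VALUE_UNSET;

	/* expands to: if (value == DBL_MAX) { value = ui_but_value_get(but); } */
	UI_GET_BUT_VALUE_INIT(but, value);

	return (value > 0.0);
}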
#define B_NOP -1
/**
 * a full doc with API notes can be found in 'blender/doc/guides/interface_API.txt'
 *
 * `uiBlahBlah()` external function.
 * `ui_blah_blah()` internal function.
 */
static void ui_but_free(const bContext *C, uiBut *but);
static bool ui_but_is_unit_radians_ex(UnitSettings *unit, const int unit_type)
{
	return (unit->system_rotation == USER_UNIT_ROT_RADIANS && unit_type == PROP_UNIT_ROTATION);
}
static bool ui_but_is_unit_radians(const uiBut *but)
{
	UnitSettings *unit = but->block->unit;
	const int unit_type = UI_but_unit_type_get(but);

	return ui_but_is_unit_radians_ex(unit, unit_type);
}
/* ************* window matrix ************** */
void ui_block_to_window_fl(const ARegion *ar, uiBlock *block, float *x, float *y)
{
	float gx, gy;
	int sx, sy, getsizex, getsizey;

	getsizex = BLI_rcti_size_x(&ar->winrct) + 1;
	getsizey = BLI_rcti_size_y(&ar->winrct) + 1;
	sx = ar->winrct.xmin;
	sy = ar->winrct.ymin;
	gx = *x;
	gy = *y;

	if (block->panel) {
		gx += block->panel->ofsx;
		gy += block->panel->ofsy;
	}
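
	/* Project with the block's window matrix, then map the resulting normalized
	 * device coordinates into window pixels, offset by the region's lower-left
	 * corner (sx, sy). */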
	*x = ((float)sx) + ((float)getsizex) * (0.5f + 0.5f * (gx * block->winmat[0][0] + gy * block->winmat[1][0] + block->winmat[3][0]));
	*y = ((float)sy) + ((float)getsizey) * (0.5f + 0.5f * (gx * block->winmat[0][1] + gy * block->winmat[1][1] + block->winmat[3][1]));
}
void ui_block_to_window(const ARegion *ar, uiBlock *block, int *x, int *y)
{
	float fx, fy;

	fx = *x;
	fy = *y;
	ui_block_to_window_fl(ar, block, &fx, &fy);

	*x = (int)(fx + 0.5f);
	*y = (int)(fy + 0.5f);
}

void ui_block_to_window_rctf(const ARegion *ar, uiBlock *block, rctf *rct_dst, const rctf *rct_src)
{
  *rct_dst = *rct_src;
  ui_block_to_window_fl(ar, block, &rct_dst->xmin, &rct_dst->ymin);
  ui_block_to_window_fl(ar, block, &rct_dst->xmax, &rct_dst->ymax);
}
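
/*
 * Illustrative usage (not part of the original source): convert a button's
 * block-space rectangle into window coordinates, e.g. to compare it against
 * an event position given in window space. `but` is a placeholder for a uiBut
 * owned by `block`.
 *
 *   rctf win_rect;
 *   ui_block_to_window_rctf(ar, block, &win_rect, &but->rect);
 */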

float ui_block_to_window_scale(const ARegion *ar, uiBlock *block)
{
  /* We could have a function for this to avoid the dummy arg. */
  float dummy_x;
  float min_y = 0, max_y = 1;
  dummy_x = 0.0f;
  ui_block_to_window_fl(ar, block, &dummy_x, &min_y);
  dummy_x = 0.0f;
  ui_block_to_window_fl(ar, block, &dummy_x, &max_y);
  return max_y - min_y;
}
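
/*
 * Illustrative usage (not part of the original source): the return value is
 * the block-to-window scale along Y, so a block-space height can be converted
 * to an on-screen pixel height. `but` is a placeholder name.
 *
 *   const float scale = ui_block_to_window_scale(ar, block);
 *   const float pixel_height = scale * BLI_rctf_size_y(&but->rect);
 */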

/* for mouse cursor */
void ui_window_to_block_fl(const ARegion *ar, uiBlock *block, float *x, float *y)
{
  float a, b, c, d, e, f, px, py;
  int sx, sy, getsizex, getsizey;

  getsizex = BLI_rcti_size_x(&ar->winrct) + 1;
  getsizey = BLI_rcti_size_y(&ar->winrct) + 1;
  sx = ar->winrct.xmin;
  sy = ar->winrct.ymin;

  /* ui_block_to_window_fl maps block coordinates to window coordinates as
   *   win_x - sx = a * block_x + b * block_y + c
   *   win_y - sy = d * block_x + e * block_y + f
   * so compute the coefficients and invert that 2x2 affine system below. */
  a = 0.5f * ((float)getsizex) * block->winmat[0][0];
  b = 0.5f * ((float)getsizex) * block->winmat[1][0];
  c = 0.5f * ((float)getsizex) * (1.0f + block->winmat[3][0]);

  d = 0.5f * ((float)getsizey) * block->winmat[0][1];
  e = 0.5f * ((float)getsizey) * block->winmat[1][1];
  f = 0.5f * ((float)getsizey) * (1.0f + block->winmat[3][1]);

  px = *x - sx;
  py = *y - sy;

  *y = (a * (py - f) + d * (c - px)) / (a * e - d * b);
  *x = (px - b * (*y) - c) / a;

  if (block->panel) {
    /* ui_block_to_window_fl adds the panel offset before projecting,
     * subtract it again now that the point is back in block space. */
    *x -= block->panel->ofsx;
    *y -= block->panel->ofsy;
  }
}

void ui_window_to_block(const ARegion *ar, uiBlock *block, int *x, int *y)
{
  float fx, fy;

  fx = *x;
  fy = *y;
  ui_window_to_block_fl(ar, block, &fx, &fy);

  *x = (int)(fx + 0.5f);
  *y = (int)(fy + 0.5f);
}
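
/*
 * Illustrative usage (not part of the original source): a cursor position that
 * arrives in window coordinates can be brought into block space before testing
 * it against button rectangles. `event` is assumed to be a wmEvent carrying
 * window-space x/y.
 *
 *   int mx = event->x, my = event->y;
 *   ui_window_to_block(ar, block, &mx, &my);
 *   mx/my are now comparable with but->rect.
 */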

void ui_window_to_region(const ARegion *ar, int *x, int *y)
{
  *x -= ar->winrct.xmin;
  *y -= ar->winrct.ymin;
}

void ui_region_to_window(const ARegion *ar, int *x, int *y)
{
  *x += ar->winrct.xmin;
  *y += ar->winrct.ymin;
}
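
/*
 * Illustrative note (not part of the original source): the two helpers above
 * are exact inverses, each only shifts by the region's window-space origin.
 *
 *   int x = 10, y = 20;               (region-relative)
 *   ui_region_to_window(ar, &x, &y);  (now window-relative)
 *   ui_window_to_region(ar, &x, &y);  (back to 10, 20)
 */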
|
2015-08-18 14:16:58 +10:00
|
|
|
|
2018-06-11 14:33:53 +02:00
|
|
|
static void ui_update_flexible_spacing(const ARegion *region, uiBlock *block)
{
	int sepr_flex_len = 0;
	for (uiBut *but = block->buttons.first; but; but = but->next) {
		if (but->type == UI_BTYPE_SEPR_SPACER) {
			sepr_flex_len++;
		}
	}

	if (sepr_flex_len == 0) {
		return;
	}

	rcti rect;
	ui_but_to_pixelrect(&rect, region, block, block->buttons.last);
	const float buttons_width = (float)rect.xmax + UI_HEADER_OFFSET;
	const float region_width = (float)region->sizex * U.dpi_fac;

	if (region_width <= buttons_width) {
		return;
	}

	/* We could get rid of this loop if we agree on a max number of spacers. */
	int *spacers_pos = alloca(sizeof(*spacers_pos) * (size_t)sepr_flex_len);
	int i = 0;
	for (uiBut *but = block->buttons.first; but; but = but->next) {
		if (but->type == UI_BTYPE_SEPR_SPACER) {
			ui_but_to_pixelrect(&rect, region, block, but);
			spacers_pos[i] = rect.xmax + UI_HEADER_OFFSET;
			i++;
		}
	}

	const float segment_width = region_width / (float)sepr_flex_len;
	float offset = 0, remaining_space = region_width - buttons_width;
	i = 0;
	for (uiBut *but = block->buttons.first; but; but = but->next) {
		BLI_rctf_translate(&but->rect, offset, 0);
		if (but->type == UI_BTYPE_SEPR_SPACER) {
			/* How much the next block overlaps with the current segment. */
			int overlap = (
			        (i == sepr_flex_len - 1) ?
			        buttons_width - spacers_pos[i] :
			        (spacers_pos[i + 1] - spacers_pos[i]) / 2);
			int segment_end = segment_width * (i + 1);
			int spacer_end = segment_end - overlap;
			int spacer_sta = spacers_pos[i] + offset;
			if (spacer_end > spacer_sta) {
				float step = min_ff(remaining_space, spacer_end - spacer_sta);
				remaining_space -= step;
				offset += step;
			}
			i++;
		}
	}
	ui_block_bounds_calc(block);
}
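/* Worked example (not from the original file; made-up numbers): mirrors the
 * segment/overlap arithmetic above for two spacers in a 1000 px wide header
 * holding 400 px of buttons, with the spacers ending at x = 100 and x = 300. */
static float example_flexible_spacing_offset(void)
{
	const float region_width = 1000.0f, buttons_width = 400.0f;
	const int spacers_pos[2] = {100, 300};
	const int sepr_flex_len = 2;
	const float segment_width = region_width / (float)sepr_flex_len;      /* 500 */
	float offset = 0.0f, remaining_space = region_width - buttons_width;  /* 600 */

	for (int i = 0; i < sepr_flex_len; i++) {
		int overlap = (i == sepr_flex_len - 1) ?
		              (int)buttons_width - spacers_pos[i] :       /* i == 1: 400 - 300 = 100 */
		              (spacers_pos[i + 1] - spacers_pos[i]) / 2;  /* i == 0: (300 - 100) / 2 = 100 */
		int segment_end = segment_width * (i + 1);  /* 500, then 1000 */
		int spacer_end = segment_end - overlap;     /* 400, then 900 */
		int spacer_sta = spacers_pos[i] + offset;   /* 100, then 600 */
		if (spacer_end > spacer_sta) {
			float step = min_ff(remaining_space, spacer_end - spacer_sta);  /* 300 each time */
			remaining_space -= step;
			offset += step;
		}
	}

	/* offset ends at 600: buttons after the last spacer are pushed right so the
	 * right-most button ends at 400 + 600 = 1000 = region_width. */
	return offset;
}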

static void ui_update_window_matrix(const wmWindow *window, const ARegion *region, uiBlock *block)
{
	/* window matrix and aspect */
	if (region && region->visible) {
		/* Get projection matrix which includes View2D translation and zoom. */
		GPU_matrix_projection_get(block->winmat);
		block->aspect = 2.0f / fabsf(region->winx * block->winmat[0][0]);
	}
	else {
		/* No subwindow created yet, for menus for example, so we use the main
		 * window instead, since buttons are created there anyway. */
		int width = WM_window_pixels_x(window);
		int height = WM_window_pixels_y(window);
		rcti winrct = {0, width - 1, 0, height - 1};

		wmGetProjectionMatrix(block->winmat, &winrct);
		block->aspect = 2.0f / fabsf(width * block->winmat[0][0]);
	}
}
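/* Sanity-check sketch for the aspect formula (not from the original file, and it
 * assumes a plain orthographic projection with no View2D zoom): such a projection
 * over a region 'winx' pixels wide has winmat[0][0] == 2 / winx, so the expression
 * evaluates to 1.0; zooming in makes winmat[0][0] larger and the aspect smaller. */
static float example_aspect_at_unit_zoom(float winx)
{
	const float m00 = 2.0f / winx;    /* orthographic, 1:1 zoom */
	return 2.0f / fabsf(winx * m00);  /* == 1.0f */
}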

/**
 * Popups will add a margin to #ARegion.winrct for shadow,
 * for interactivity (point-inside tests, for example) we want the winrct without the margin added.
 */
void ui_region_winrct_get_no_margin(const struct ARegion *ar, struct rcti *r_rect)
{
	uiBlock *block = ar->uiblocks.first;
	if (block && (block->flag & UI_BLOCK_LOOP) && (block->flag & UI_BLOCK_RADIAL) == 0) {
		BLI_rcti_rctf_copy_floor(r_rect, &block->rect);
		BLI_rcti_translate(r_rect, ar->winrct.xmin, ar->winrct.ymin);
	}
	else {
		*r_rect = ar->winrct;
	}
}
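/* Sketch (not from the original file) of why the margin-free rect matters for
 * hit-testing: clicks on a popup's shadow padding should not count as inside.
 * Assumes BLI_rcti_isect_pt() from BLI_rect.h. */
static bool example_popup_contains_point(const ARegion *ar, int win_x, int win_y)
{
	rcti winrct;
	ui_region_winrct_get_no_margin(ar, &winrct);
	return BLI_rcti_isect_pt(&winrct, win_x, win_y);
}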
/* ******************* block calc ************************* */

2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, which will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation for popup menus.
The old code tried to find out in advance what the size
of the menu was, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.

void UI_block_translate(uiBlock *block, int x, int y)
{
	uiBut *but;

	for (but = block->buttons.first; but; but = but->next) {
		BLI_rctf_translate(&but->rect, x, y);
	}

	BLI_rctf_translate(&block->rect, x, y);
}
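/* Caller-side sketch (hypothetical, not from the original file): because
 * UI_block_translate() moves every button together with block->rect, a popup
 * can be nudged back inside a window by a single offset. 'win_size_x/y' are
 * assumed window dimensions in pixels. */
static void example_clamp_block_to_window(uiBlock *block, int win_size_x, int win_size_y)
{
	int offset_x = 0, offset_y = 0;

	if (block->rect.xmax > win_size_x) offset_x = win_size_x - (int)block->rect.xmax;
	if (block->rect.xmin < 0)          offset_x = -(int)block->rect.xmin;
	if (block->rect.ymax > win_size_y) offset_y = win_size_y - (int)block->rect.ymax;
	if (block->rect.ymin < 0)          offset_y = -(int)block->rect.ymin;

	if (offset_x || offset_y) {
		UI_block_translate(block, offset_x, offset_y);
	}
}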

static void ui_block_bounds_calc_text(uiBlock *block, float offset)
{
	uiStyle *style = UI_style_get();
	uiBut *bt, *init_col_bt, *col_bt;
	int i = 0, j, x1addval = offset;

	UI_fontstyle_set(&style->widget);

	for (init_col_bt = bt = block->buttons.first; bt; bt = bt->next) {
		if (!ELEM(bt->type, UI_BTYPE_SEPR, UI_BTYPE_SEPR_LINE, UI_BTYPE_SEPR_SPACER)) {
			j = BLF_width(style->widget.uifont_id, bt->drawstr, sizeof(bt->drawstr));

			if (j > i)
				i = j;
		}

		if (bt->next && bt->rect.xmin < bt->next->rect.xmin) {
			/* End of this column, and it's not the last one. */
			for (col_bt = init_col_bt; col_bt->prev != bt; col_bt = col_bt->next) {
				col_bt->rect.xmin = x1addval;
				col_bt->rect.xmax = x1addval + i + block->bounds;

				ui_but_update(col_bt); /* clips text again */
			}

			/* And we prepare next column. */
			x1addval += i + block->bounds;
			i = 0;
			init_col_bt = col_bt;
		}
	}

	/* Last column. */
	for (col_bt = init_col_bt; col_bt; col_bt = col_bt->next) {
		col_bt->rect.xmin = x1addval;
		col_bt->rect.xmax = max_ff(x1addval + i + block->bounds, offset + block->minbounds);

		ui_but_update(col_bt); /* clips text again */
	}
}
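/* For reference, a minimal sketch (not from the original file) of the text
 * measurement that drives the per-column width above: BLF_width() returns the
 * pixel width of the string for the given font, here assumed to be the UI
 * widget font id. */
static int example_label_width(int fontid, const char *label)
{
	return (int)BLF_width(fontid, label, strlen(label));
}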
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void ui_block_bounds_calc(uiBlock *block)
|
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
{
|
|
|
|
uiBut *bt;
|
|
|
|
int xof;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2014-02-08 06:07:10 +11:00
|
|
|
if (BLI_listbase_is_empty(&block->buttons)) {
|
2012-03-24 06:38:07 +00:00
|
|
|
if (block->panel) {
|
2012-08-18 16:53:46 +00:00
|
|
|
block->rect.xmin = 0.0; block->rect.xmax = block->panel->sizex;
|
|
|
|
block->rect.ymin = 0.0; block->rect.ymax = block->panel->sizey;
|
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
else {
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
BLI_rctf_init_minmax(&block->rect);
|
2009-08-21 02:51:56 +00:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
for (bt = block->buttons.first; bt; bt = bt->next) {
|
|
|
|
BLI_rctf_union(&block->rect, &bt->rect);
|
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-08-18 16:53:46 +00:00
|
|
|
block->rect.xmin -= block->bounds;
|
|
|
|
block->rect.ymin -= block->bounds;
|
|
|
|
block->rect.xmax += block->bounds;
|
|
|
|
block->rect.ymax += block->bounds;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2012-10-23 13:28:22 +00:00
|
|
|
block->rect.xmax = block->rect.xmin + max_ff(BLI_rctf_size_x(&block->rect), block->minbounds);
|
2009-08-21 02:51:56 +00:00
|
|
|
|
2018-05-23 10:47:12 +02:00
|
|
|
/* hardcoded exception... but that one is annoying with larger safety */
|
2012-03-30 01:51:25 +00:00
|
|
|
bt = block->buttons.first;
|
2015-01-26 16:03:11 +01:00
|
|
|
if (bt && STREQLEN(bt->str, "ERROR", 5)) xof = 10;
|
2012-03-30 01:51:25 +00:00
|
|
|
else xof = 40;
|
|
|
|
|
2012-08-18 16:53:46 +00:00
|
|
|
block->safety.xmin = block->rect.xmin - xof;
|
|
|
|
block->safety.ymin = block->rect.ymin - xof;
|
|
|
|
block->safety.xmax = block->rect.xmax + xof;
|
|
|
|
block->safety.ymax = block->rect.ymax + xof;
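/* note (assumption, not shown in this file): the padded `safety` rect computed
 * above appears to serve as a tolerance zone for the popup/menu handling code,
 * so the mouse can stray slightly outside the block without the menu closing
 * immediately; the lines above only compute the padding itself. */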
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
static void ui_block_bounds_calc_centered(wmWindow *window, uiBlock *block)
|
2009-11-23 13:58:55 +00:00
|
|
|
{
|
|
|
|
int xmax, ymax;
|
|
|
|
int startx, starty;
|
|
|
|
int width, height;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-11-30 14:10:46 +00:00
|
|
|
/* note: this is used for the splash where window bounds event has not been
|
|
|
|
* updated by ghost, get the window bounds from ghost directly */
|
|
|
|
|
Holiday coding log :)
Nice formatted version (pictures soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.66/Usability
Short list of main changes:
- Transparent region option (over main region), added code to blend in/out such panels.
- Min size window now 640 x 480
- Fixed DPI for ui - lots of cleanup and changes everywhere. Icon images need correct sizes still, layer-in-use icon needs remake.
- Macbook retina support, use command line --no-native-pixels to disable it
- Timeline Marker label was drawing wrong
- Trackpad and magic mouse: supports zoom (hold ctrl)
- Fix for splash position: removed ghost function and made window size update after creation immediate
- Fast undo buffer save now adds UI as well. Could be checked for regular file save even...
Quit.blend and temp file saving use this now.
- Fixed filename in window on reading quit.blend or temp saves, and they now add a warning in window title: "(Recovered)"
- New Userpref option "Keep Session" - this always saves quit.blend, and loads on start.
This allows keeping UI and data without actual saves, until you actually save.
When you load startup.blend and quit, it recognises the quit.blend as a startup (no file name in header)
- Added 3D view copy/paste buffers (selected objects). Shortcuts ctrl-c, ctrl-v (OSX, cmd-c, cmd-v).
Coded partial file saving for it. Could be used for other purposes. Todo: use OS clipboards.
- User preferences (themes, keymaps, user settings) now can be saved as a separate file.
Old option is called "Save Startup File" the new one "Save User Settings".
To visualise this difference, the 'save startup file' button has been removed from user preferences window. That option is available as CTRL+U and in File menu still.
- OSX: fixed bug that stopped mouse events from being delivered outside the window.
This also fixes "Continuous Grab" for OSX. (error since 2009)
2012-12-12 18:58:11 +00:00
|
|
|
xmax = WM_window_pixels_x(window);
|
|
|
|
ymax = WM_window_pixels_y(window);
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-09-15 11:48:20 +00:00
|
|
|
width = BLI_rctf_size_x(&block->rect);
|
|
|
|
height = BLI_rctf_size_y(&block->rect);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-11-23 13:58:55 +00:00
|
|
|
startx = (xmax * 0.5f) - (width * 0.5f);
|
|
|
|
starty = (ymax * 0.5f) - (height * 0.5f);
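	/* worked example (illustrative numbers): a 1920x1080 window and a 300x200
	 * block give startx = 1920 * 0.5 - 300 * 0.5 = 810 and
	 * starty = 1080 * 0.5 - 200 * 0.5 = 440, i.e. the block ends up centered. */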
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2018-05-13 18:23:21 +02:00
|
|
|
UI_block_translate(block, startx - block->rect.xmin, starty - block->rect.ymin);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-11-23 13:58:55 +00:00
|
|
|
/* now recompute bounds and safety */
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-11-23 13:58:55 +00:00
|
|
|
}
|
Pie Menus C code backend.
This commit merges the code in the pie-menu branch.
As per decisions taken the last few days, there are no pie menus
included and there will be an official add-on including overrides of
some keys with pie menus. However, people will now be able to use the
new code in python.
Full Documentation is in http://wiki.blender.org/index.php/Dev:Ref/
Thanks:
Campbell Barton, Dalai Felinto and Ton Roosendaal for the code review
and design comments
Jonathan Williamson, Pawel Lyczkowski, Pablo Vazquez among others for
suggestions during the development.
Special Thanks to Sean Olson, for his support, suggestions, testing and
merciless bugging so that I would finish the pie menu code. Without him
we wouldn't be here. Also to the rest of the developers of the original
python add-on, Patrick Moore and Dan Eicher and finally to Matt Ebb, who
did the research and first implementation and whose code I used to get
started.
2014-08-11 10:39:59 +02:00
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
static void ui_block_bounds_calc_centered_pie(uiBlock *block)
|
2014-08-11 10:39:59 +02:00
|
|
|
{
|
|
|
|
const int xy[2] = {
|
|
|
|
block->pie_data.pie_center_spawned[0],
|
2019-02-03 14:01:45 +11:00
|
|
|
block->pie_data.pie_center_spawned[1],
|
2014-08-11 10:39:59 +02:00
|
|
|
};
|
|
|
|
|
2018-05-13 18:23:21 +02:00
|
|
|
UI_block_translate(block, xy[0], xy[1]);
|
2014-08-11 10:39:59 +02:00
|
|
|
|
|
|
|
/* now recompute bounds and safety */
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2014-08-11 10:39:59 +02:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
static void ui_block_bounds_calc_popup(
|
|
|
|
wmWindow *window, uiBlock *block,
|
2017-10-12 14:43:45 +02:00
|
|
|
eBlockBoundsCalc bounds_calc, const int xy[2], int r_xy[2])
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2015-10-17 00:23:57 +11:00
|
|
|
int width, height, oldwidth, oldheight;
|
2017-10-12 14:43:45 +02:00
|
|
|
int oldbounds, xmax, ymax, raw_x, raw_y;
|
2013-02-07 02:03:31 +00:00
|
|
|
const int margin = UI_SCREEN_MARGIN;
|
2015-10-17 00:23:57 +11:00
|
|
|
rcti rect, rect_bounds;
|
|
|
|
int ofs_dummy[2];
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
oldbounds = block->bounds;
|
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and improve the bounds computation for
popup menus. The old code tried to find out in advance what the
size of the menu would be, but this is difficult with keymap
strings in there; now uiPopupBoundsBlock can figure this out
afterwards and ensure the popup stays within the window bounds
(see the sketch below). Will convert some other functions to use this too.
2009-01-30 12:18:08 +00:00
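A minimal, self-contained sketch of the compute-then-clamp approach described in the message above: the popup is laid out first, and its rectangle is only afterwards pushed back inside the window. This is plain C, not the Blender API, and every name and number in it is illustrative.

#include <stdio.h>

typedef struct { int xmin, ymin, xmax, ymax; } rect_i;

/* Push `r` back inside a win_w x win_h window, keeping its size and a margin. */
static void clamp_popup_rect(rect_i *r, int win_w, int win_h, int margin)
{
	const int w = r->xmax - r->xmin;
	const int h = r->ymax - r->ymin;
	if (r->xmax > win_w - margin) { r->xmax = win_w - margin; r->xmin = r->xmax - w; }
	if (r->ymax > win_h - margin) { r->ymax = win_h - margin; r->ymin = r->ymax - h; }
	if (r->xmin < margin)         { r->xmin = margin;         r->xmax = r->xmin + w; }
	if (r->ymin < margin)         { r->ymin = margin;         r->ymax = r->ymin + h; }
}

int main(void)
{
	/* a popup laid out partly off a 1920x1080 window */
	rect_i popup = {1880, -20, 2140, 180};
	clamp_popup_rect(&popup, 1920, 1080, 10);
	printf("%d %d %d %d\n", popup.xmin, popup.ymin, popup.xmax, popup.ymax); /* 1650 10 1910 210 */
	return 0;
}

ui_block_bounds_calc_popup below does the real work with the block's own rect, margins and BLI_rcti_clamp; the sketch only isolates the clamp step.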
|
|
|
|
2009-02-04 11:52:16 +00:00
|
|
|
/* compute mouse position with user defined offset */
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-12-12 18:58:11 +00:00
|
|
|
xmax = WM_window_pixels_x(window);
|
|
|
|
ymax = WM_window_pixels_y(window);
|
2009-01-30 12:18:08 +00:00
|
|
|
|
2012-09-15 11:48:20 +00:00
|
|
|
oldwidth = BLI_rctf_size_x(&block->rect);
|
|
|
|
oldheight = BLI_rctf_size_y(&block->rect);
|
2010-02-01 11:13:55 +00:00
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* first we ensure wide enough text bounds */
|
2012-03-30 01:51:25 +00:00
|
|
|
if (bounds_calc == UI_BLOCK_BOUNDS_POPUP_MENU) {
|
2012-03-24 06:38:07 +00:00
|
|
|
if (block->flag & UI_BLOCK_LOOP) {
|
2012-12-12 18:58:11 +00:00
|
|
|
block->bounds = 2.5f * UI_UNIT_X;
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc_text(block, block->rect.xmin);
|
2009-02-04 11:52:16 +00:00
|
|
|
}
|
|
|
|
}
|
2009-01-30 12:18:08 +00:00
|
|
|
|
|
|
|
/* next we recompute bounds */
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = oldbounds;
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2009-01-30 12:18:08 +00:00
|
|
|
|
|
|
|
/* and we adjust the position to fit within window */
|
2012-09-15 11:48:20 +00:00
|
|
|
width = BLI_rctf_size_x(&block->rect);
|
|
|
|
height = BLI_rctf_size_y(&block->rect);
|
2011-04-21 13:11:51 +00:00
|
|
|
|
2010-03-22 09:30:00 +00:00
|
|
|
/* avoid divide by zero below, caused by calling with no UI, but better not crash */
|
2012-03-30 01:51:25 +00:00
|
|
|
oldwidth = oldwidth > 0 ? oldwidth : MAX2(1, width);
|
|
|
|
oldheight = oldheight > 0 ? oldheight : MAX2(1, height);
|
2009-01-30 12:18:08 +00:00
|
|
|
|
2010-02-01 11:13:55 +00:00
|
|
|
/* offset block based on mouse position, user offset is scaled
|
2014-11-09 21:20:40 +01:00
|
|
|
* along in case we resized the block in ui_block_bounds_calc_text */
|
2019-03-13 16:35:24 +11:00
|
|
|
raw_x = rect.xmin = xy[0] + block->rect.xmin + (block->bounds_offset[0] * width) / oldwidth;
|
|
|
|
raw_y = rect.ymin = xy[1] + block->rect.ymin + (block->bounds_offset[1] * height) / oldheight;
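	/* worked example (illustrative numbers, not from the source): with
	 * oldwidth = 100, a stored bounds_offset[0] = 20 and a new width = 150
	 * after ui_block_bounds_calc_text, the offset scales to
	 * 20 * 150 / 100 = 30, so the popup keeps the same relative position
	 * under the mouse even though the block was resized. */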
|
2015-10-17 00:23:57 +11:00
|
|
|
rect.xmax = rect.xmin + width;
|
|
|
|
rect.ymax = rect.ymin + height;
|
|
|
|
|
|
|
|
rect_bounds.xmin = margin;
|
|
|
|
rect_bounds.ymin = margin;
|
|
|
|
rect_bounds.xmax = xmax - margin;
|
|
|
|
rect_bounds.ymax = ymax - UI_POPUP_MENU_TOP;
|
|
|
|
|
|
|
|
BLI_rcti_clamp(&rect, &rect_bounds, ofs_dummy);
|
2018-05-13 18:23:21 +02:00
|
|
|
UI_block_translate(block, rect.xmin - block->rect.xmin, rect.ymin - block->rect.ymin);
|
2009-01-30 12:18:08 +00:00
|
|
|
|
|
|
|
/* now recompute bounds and safety */
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2017-10-12 14:43:45 +02:00
|
|
|
|
|
|
|
/* If given, adjust input coordinates such that they would generate real final popup position.
|
2019-01-15 23:24:20 +11:00
|
|
|
* Needed to handle correctly floating panels once they have been dragged around,
|
|
|
|
* see T52999. */
|
2017-10-12 14:43:45 +02:00
|
|
|
if (r_xy) {
|
|
|
|
r_xy[0] = xy[0] + block->rect.xmin - raw_x;
|
|
|
|
r_xy[1] = xy[1] + block->rect.ymin - raw_y;
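		/* example (illustrative): if clamping moved the popup 40 px left,
		 * then block->rect.xmin == raw_x - 40 and r_xy[0] == xy[0] - 40,
		 * so reopening the popup at r_xy reproduces the clamped position. */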
|
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* used for various cases */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_bounds_set_normal(uiBlock *block, int addval)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
if (block == NULL)
|
2009-01-30 12:18:08 +00:00
|
|
|
return;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = addval;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS;
|
2009-01-30 12:18:08 +00:00
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* used for pulldowns */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_bounds_set_text(uiBlock *block, int addval)
|
2009-01-30 12:18:08 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = addval;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS_TEXT;
|
2009-01-30 12:18:08 +00:00
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2009-02-04 11:52:16 +00:00
|
|
|
/* used for block popups */
|
2019-03-13 16:35:24 +11:00
|
|
|
void UI_block_bounds_set_popup(uiBlock *block, int addval, const int bounds_offset[2])
|
2009-01-30 12:18:08 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = addval;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS_POPUP_MOUSE;
|
2019-03-13 16:35:24 +11:00
|
|
|
if (bounds_offset != NULL) {
|
|
|
|
block->bounds_offset[0] = bounds_offset[0];
|
|
|
|
block->bounds_offset[1] = bounds_offset[1];
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
block->bounds_offset[0] = 0;
|
|
|
|
block->bounds_offset[1] = 0;
|
|
|
|
}
|
2009-02-04 11:52:16 +00:00
|
|
|
}
|
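A brief usage sketch for UI_block_bounds_set_popup above; the caller and the offset values are illustrative, and the reading of addval as a margin added around the block's buttons is an assumption based on how block->bounds is used elsewhere.

/* Hedged usage sketch (not from this file). */
static void example_popup_bounds(uiBlock *block)
{
	const int offset[2] = {20, -10};

	/* small margin around the buttons, nudged from the default popup placement */
	UI_block_bounds_set_popup(block, 6, offset);

	/* passing NULL for the offset leaves it at (0, 0): */
	/* UI_block_bounds_set_popup(block, 6, NULL); */
}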
|
|
|
|
|
|
|
/* used for menu popups */
|
2019-03-13 16:35:24 +11:00
|
|
|
void UI_block_bounds_set_menu(uiBlock *block, int addval, const int bounds_offset[2])
|
2009-02-04 11:52:16 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = addval;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS_POPUP_MENU;
|
2019-03-13 16:35:24 +11:00
|
|
|
if (bounds_offset != NULL) {
|
|
|
|
block->bounds_offset[0] = bounds_offset[0];
|
|
|
|
block->bounds_offset[1] = bounds_offset[1];
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
block->bounds_offset[0] = 0;
|
|
|
|
block->bounds_offset[1] = 0;
|
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2009-11-23 13:58:55 +00:00
|
|
|
/* used for centered popups, i.e. splash */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_bounds_set_centered(uiBlock *block, int addval)
|
2009-11-23 13:58:55 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->bounds = addval;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS_POPUP_CENTER;
|
2009-11-23 13:58:55 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_bounds_set_explicit(uiBlock *block, int minx, int miny, int maxx, int maxy)
|
2011-11-22 17:49:06 +00:00
|
|
|
{
|
2012-08-18 16:53:46 +00:00
|
|
|
block->rect.xmin = minx;
|
|
|
|
block->rect.ymin = miny;
|
|
|
|
block->rect.xmax = maxx;
|
|
|
|
block->rect.ymax = maxy;
|
2012-09-10 07:03:30 +00:00
|
|
|
block->bounds_type = UI_BLOCK_BOUNDS_NONE;
|
2011-11-22 17:49:06 +00:00
|
|
|
}
|
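A one-line usage sketch for UI_block_bounds_set_explicit; the interpretation that UI_BLOCK_BOUNDS_NONE makes later layout code keep block->rect as given, rather than recomputing it from the buttons, is an assumption based on the enum name.

/* pin the block to a fixed rectangle: xmin, ymin, xmax, ymax */
/* UI_block_bounds_set_explicit(block, 0, 0, 200, 100); */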
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
static int ui_but_calc_float_precision(uiBut *but, double value)
|
2011-06-07 02:39:40 +00:00
|
|
|
{
|
2014-04-24 17:17:55 +02:00
|
|
|
int prec = (int)but->a2;
|
2011-06-07 04:06:10 +00:00
|
|
|
|
2014-04-24 17:17:55 +02:00
|
|
|
/* first check for various special cases:
|
|
|
|
* * If button is radians, we want additional precision (see T39861).
|
|
|
|
* * If prec is not set, we fall back to a simple default */
|
2014-11-09 21:20:40 +01:00
|
|
|
if (ui_but_is_unit_radians(but) && prec < 5) {
|
2014-04-24 17:17:55 +02:00
|
|
|
prec = 5;
|
|
|
|
}
|
|
|
|
else if (prec == -1) {
|
2012-03-30 01:51:25 +00:00
|
|
|
prec = (but->hardmax < 10.001f) ? 3 : 2;
|
2011-06-07 04:06:10 +00:00
|
|
|
}
|
2017-09-18 19:50:40 +02:00
|
|
|
else {
|
|
|
|
CLAMP(prec, 0, UI_PRECISION_FLOAT_MAX);
|
|
|
|
}
|
2011-06-07 04:06:10 +00:00
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
return UI_calc_float_precision(prec, value);
|
2014-01-08 17:04:10 +01:00
|
|
|
}
|
2011-06-07 04:06:10 +00:00
|
|
|
|
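Worked examples for the precision rules above; the values follow directly from the branches shown, and the final call to UI_calc_float_precision may further adjust the result from the actual value (its behavior is not shown here).

/* but->a2 == -1, but->hardmax == 1.0f    -> prec = 3 (hardmax below 10.001f)            */
/* but->a2 == -1, but->hardmax == 100.0f  -> prec = 2                                    */
/* radians unit button with but->a2 == 2  -> prec = 5 (see T39861)                       */
/* but->a2 == 12, not radians             -> clamped into [0, UI_PRECISION_FLOAT_MAX]    */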
2008-11-11 18:31:32 +00:00
|
|
|
/* ************** BLOCK ENDING FUNCTION ************* */
|
|
|
|
|
2009-06-10 11:43:21 +00:00
|
|
|
/* NOTE: if but->poin is allocated memory for every defbut, things fail... */
|
2013-07-27 08:06:46 +00:00
|
|
|
static bool ui_but_equals_old(const uiBut *but, const uiBut *oldbut)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2011-10-15 14:14:22 +00:00
|
|
|
/* various properties are being compared here, hopefully sufficient
|
2008-11-11 18:31:32 +00:00
|
|
|
* to catch all cases, but it is simple to add more checks later */
|
2013-04-04 02:05:11 +00:00
|
|
|
if (but->retval != oldbut->retval) return false;
|
|
|
|
if (but->rnapoin.data != oldbut->rnapoin.data) return false;
|
2014-02-08 01:33:39 +11:00
|
|
|
if (but->rnaprop != oldbut->rnaprop || but->rnaindex != oldbut->rnaindex) return false;
|
2013-04-04 02:05:11 +00:00
|
|
|
if (but->func != oldbut->func) return false;
|
|
|
|
if (but->funcN != oldbut->funcN) return false;
|
|
|
|
if (oldbut->func_arg1 != oldbut && but->func_arg1 != oldbut->func_arg1) return false;
|
|
|
|
if (oldbut->func_arg2 != oldbut && but->func_arg2 != oldbut->func_arg2) return false;
|
2013-07-27 08:06:46 +00:00
|
|
|
if (!but->funcN && ((but->poin != oldbut->poin && (uiBut *)oldbut->poin != oldbut) ||
|
|
|
|
(but->pointype != oldbut->pointype))) return false;
|
2013-04-04 02:05:11 +00:00
|
|
|
if (but->optype != oldbut->optype) return false;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2013-04-04 02:05:11 +00:00
|
|
|
return true;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-02-08 09:03:25 +11:00
|
|
|
uiBut *ui_but_find_old(uiBlock *block_old, const uiBut *but_new)
|
2014-02-08 01:46:43 +11:00
|
|
|
{
|
|
|
|
uiBut *but_old;
|
|
|
|
for (but_old = block_old->buttons.first; but_old; but_old = but_old->next) {
|
|
|
|
if (ui_but_equals_old(but_new, but_old)) {
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
return but_old;
|
|
|
|
}
|
2014-02-08 09:03:25 +11:00
|
|
|
uiBut *ui_but_find_new(uiBlock *block_new, const uiBut *but_old)
|
|
|
|
{
|
|
|
|
uiBut *but_new;
|
|
|
|
for (but_new = block_new->buttons.first; but_new; but_new = but_new->next) {
|
|
|
|
if (ui_but_equals_old(but_new, but_old)) {
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
return but_new;
|
|
|
|
}
|
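A hedged sketch of how the lookups above can be used when a block is rebuilt; the surrounding update logic and the copied flag are illustrative, only ui_but_find_old and the button-list layout come from this file.

static void example_match_buttons(uiBlock *block_new, uiBlock *block_old)
{
	uiBut *but_new;

	for (but_new = block_new->buttons.first; but_new; but_new = but_new->next) {
		uiBut *but_old = ui_but_find_old(block_old, but_new);

		if (but_old) {
			/* carry over whatever per-button state must survive the rebuild;
			 * the exact flag is hypothetical */
			/* but_new->flag |= (but_old->flag & EXAMPLE_ACTIVE_FLAG); */
		}
	}
}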
2014-02-08 01:46:43 +11:00
|
|
|
|
|
|
|
/**
|
|
|
|
* \return true when \a but_p is set (only done for active buttons).
|
|
|
|
*/
|
2014-02-08 02:22:32 +11:00
|
|
|
static bool ui_but_update_from_old_block(const bContext *C, uiBlock *block, uiBut **but_p, uiBut **but_old_p)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2013-11-21 20:25:31 +01:00
|
|
|
const int drawflag_copy = 0; /* None currently. */
|
2013-10-09 15:36:04 +00:00
|
|
|
|
2014-02-08 02:22:32 +11:00
|
|
|
uiBlock *oldblock = block->oldblock;
|
|
|
|
uiBut *oldbut = NULL, *but = *but_p;
|
2014-02-08 01:46:43 +11:00
|
|
|
bool found_active = false;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2014-02-08 02:22:32 +11:00
|
|
|
|
|
|
|
#if 0
|
|
|
|
/* simple/stupid - search every time */
|
2014-02-08 01:46:43 +11:00
|
|
|
oldbut = ui_but_find_old(oldblock, but);
|
2014-02-08 02:22:32 +11:00
|
|
|
(void)but_old_p;
|
|
|
|
#else
|
|
|
|
BLI_assert(*but_old_p == NULL || BLI_findindex(&oldblock->buttons, *but_old_p) != -1);
|
|
|
|
|
|
|
|
/* fastpath - avoid loop-in-loop, calling 'ui_but_find_old'
|
|
|
|
* as long as old/new buttons are aligned */
|
|
|
|
if (LIKELY(*but_old_p && ui_but_equals_old(but, *but_old_p))) {
|
|
|
|
oldbut = *but_old_p;
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
/* fallback to block search */
|
|
|
|
oldbut = ui_but_find_old(oldblock, but);
|
|
|
|
}
|
|
|
|
(*but_old_p) = oldbut ? oldbut->next : NULL;
|
|
|
|
#endif
|
|
|
|
|
2014-02-08 01:46:43 +11:00
|
|
|
|
2014-02-08 02:22:32 +11:00
|
|
|
if (!oldbut) {
|
2014-02-08 01:46:43 +11:00
|
|
|
return found_active;
|
2014-02-08 02:22:32 +11:00
|
|
|
}
|
2014-02-08 01:46:43 +11:00
|
|
|
|
|
|
|
if (oldbut->active) {
|
2014-02-13 09:12:47 +11:00
|
|
|
/* flags from the buttons we want to refresh, may want to add more here... */
|
2017-07-27 12:55:17 +02:00
|
|
|
const int flag_copy = UI_BUT_REDALERT | UI_HAS_ICON;
|
2014-02-13 09:12:47 +11:00
|
|
|
|
2014-02-08 01:46:43 +11:00
|
|
|
found_active = true;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2010-11-07 05:35:41 +00:00
|
|
|
#if 0
|
2014-02-08 01:46:43 +11:00
|
|
|
but->flag = oldbut->flag;
|
|
|
|
but->active = oldbut->active;
|
|
|
|
but->pos = oldbut->pos;
|
|
|
|
but->ofs = oldbut->ofs;
|
|
|
|
but->editstr = oldbut->editstr;
|
|
|
|
but->editval = oldbut->editval;
|
|
|
|
but->editvec = oldbut->editvec;
|
|
|
|
but->editcoba = oldbut->editcoba;
|
|
|
|
but->editcumap = oldbut->editcumap;
|
|
|
|
but->selsta = oldbut->selsta;
|
|
|
|
but->selend = oldbut->selend;
|
|
|
|
but->softmin = oldbut->softmin;
|
|
|
|
but->softmax = oldbut->softmax;
|
|
|
|
oldbut->active = NULL;
|
2010-11-07 05:35:41 +00:00
|
|
|
#endif
|
2013-11-21 20:25:31 +01:00
|
|
|
|
2014-02-08 01:46:43 +11:00
|
|
|
/* move button over from oldblock to new block */
|
|
|
|
BLI_remlink(&oldblock->buttons, oldbut);
|
|
|
|
BLI_insertlinkafter(&block->buttons, but, oldbut);
|
|
|
|
oldbut->block = block;
|
|
|
|
*but_p = oldbut;
|
|
|
|
|
|
|
|
/* still stuff needs to be copied */
|
|
|
|
oldbut->rect = but->rect;
|
|
|
|
oldbut->context = but->context; /* set by Layout */
|
|
|
|
|
|
|
|
/* drawing */
|
|
|
|
oldbut->icon = but->icon;
|
|
|
|
oldbut->iconadd = but->iconadd;
|
|
|
|
oldbut->alignnr = but->alignnr;
|
|
|
|
|
|
|
|
/* typically the same pointers, but not on undo/redo */
|
|
|
|
/* XXX some menu buttons store button itself in but->poin. Ugly */
|
|
|
|
if (oldbut->poin != (char *)oldbut) {
|
|
|
|
SWAP(char *, oldbut->poin, but->poin);
|
|
|
|
SWAP(void *, oldbut->func_argN, but->func_argN);
|
2015-02-11 00:06:03 +01:00
|
|
|
SWAP(void *, oldbut->tip_argN, but->tip_argN);
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2014-02-08 01:46:43 +11:00
|
|
|
|
|
|
|
oldbut->flag = (oldbut->flag & ~flag_copy) | (but->flag & flag_copy);
|
|
|
|
oldbut->drawflag = (oldbut->drawflag & ~drawflag_copy) | (but->drawflag & drawflag_copy);
|
|
|
|
|
|
|
|
/* copy hardmin for list rows to prevent 'sticking' highlight to mouse position
|
|
|
|
* when scrolling without moving mouse (see [#28432]) */
|
2016-02-02 14:06:06 +11:00
|
|
|
if (ELEM(oldbut->type, UI_BTYPE_ROW, UI_BTYPE_LISTROW)) {
|
2014-02-08 01:46:43 +11:00
|
|
|
oldbut->hardmax = but->hardmax;
|
2016-02-02 14:06:06 +11:00
|
|
|
}
|
|
|
|
|
|
|
|
/* Selectively copy a1, a2 since their use differs across all button types
|
|
|
|
* (and we'll probably split these out later) */
|
|
|
|
if (ELEM(oldbut->type, UI_BTYPE_PROGRESS_BAR)) {
|
|
|
|
oldbut->a1 = but->a1;
|
|
|
|
}
|
2014-02-08 01:46:43 +11:00
|
|
|
|
2014-12-01 23:30:54 +01:00
|
|
|
if (!BLI_listbase_is_empty(&block->butstore)) {
|
|
|
|
UI_butstore_register_update(block, oldbut, but);
|
|
|
|
}
|
|
|
|
|
2014-02-10 12:52:35 +11:00
|
|
|
/* move/copy string from the new button to the old */
|
|
|
|
/* needed for alt+mouse wheel over enums */
|
|
|
|
if (but->str != but->strdata) {
|
|
|
|
if (oldbut->str != oldbut->strdata) {
|
|
|
|
SWAP(char *, but->str, oldbut->str);
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
oldbut->str = but->str;
|
|
|
|
but->str = but->strdata;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
if (oldbut->str != oldbut->strdata) {
|
|
|
|
MEM_freeN(oldbut->str);
|
|
|
|
oldbut->str = oldbut->strdata;
|
|
|
|
}
|
|
|
|
BLI_strncpy(oldbut->strdata, but->strdata, sizeof(oldbut->strdata));
|
|
|
|
}
|
|
|
|
|
2015-08-10 18:01:11 +02:00
|
|
|
if (but->dragpoin && (but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
|
|
|
|
SWAP(void *, but->dragpoin, oldbut->dragpoin);
|
|
|
|
}
|
|
|
|
|
2014-02-08 01:46:43 +11:00
|
|
|
BLI_remlink(&block->buttons, but);
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_free(C, but);
|
2014-02-08 01:46:43 +11:00
|
|
|
|
|
|
|
/* note: if layout hasn't been applied yet, it uses old button pointers... */
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2014-02-08 01:46:43 +11:00
|
|
|
else {
|
2014-02-13 09:12:47 +11:00
|
|
|
const int flag_copy = UI_BUT_DRAG_MULTI;
|
|
|
|
|
2014-02-08 09:34:01 +11:00
|
|
|
but->flag = (but->flag & ~flag_copy) | (oldbut->flag & flag_copy);
|
|
|
|
|
2014-02-08 01:46:43 +11:00
|
|
|
/* ensures one button can get activated, and in case the buttons
|
|
|
|
* drawn are the same this gives O(1) lookup for each button */
|
|
|
|
BLI_remlink(&oldblock->buttons, oldbut);
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_free(C, oldbut);
|
2014-02-08 01:46:43 +11:00
|
|
|
}
|
|
|
|
|
|
|
|
return found_active;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2012-03-08 04:12:11 +00:00
|
|
|
/* needed for temporary rename buttons, such as in the outliner or file-select,
|
2011-11-11 13:09:14 +00:00
|
|
|
* they should keep calling uiDefButs to keep them alive */
|
2009-07-25 13:40:59 +00:00
|
|
|
/* returns 0 when button removed */
|
2014-11-09 21:20:40 +01:00
|
|
|
bool UI_but_active_only(const bContext *C, ARegion *ar, uiBlock *block, uiBut *but)
|
2009-07-25 13:40:59 +00:00
|
|
|
{
|
|
|
|
uiBlock *oldblock;
|
|
|
|
uiBut *oldbut;
|
2013-04-04 02:05:11 +00:00
|
|
|
bool activate = false, found = false, isactive = false;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
oldblock = block->oldblock;
|
2013-04-04 02:05:11 +00:00
|
|
|
if (!oldblock) {
|
|
|
|
activate = true;
|
|
|
|
}
|
2009-07-25 13:40:59 +00:00
|
|
|
else {
|
2014-02-08 01:46:43 +11:00
|
|
|
oldbut = ui_but_find_old(oldblock, but);
|
|
|
|
if (oldbut) {
|
|
|
|
found = true;
|
|
|
|
|
|
|
|
if (oldbut->active) {
|
|
|
|
isactive = true;
|
2009-07-25 13:40:59 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
2013-04-04 02:05:11 +00:00
|
|
|
if ((activate == true) || (found == false)) {
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_activate_event((bContext *)C, ar, but);
|
2009-07-25 13:40:59 +00:00
|
|
|
}
|
2013-04-04 02:05:11 +00:00
|
|
|
else if ((found == true) && (isactive == false)) {
|
2009-07-25 13:40:59 +00:00
|
|
|
BLI_remlink(&block->buttons, but);
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_free(C, but);
|
2013-04-04 02:05:11 +00:00
|
|
|
return false;
|
2009-07-25 13:40:59 +00:00
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-04-04 02:05:11 +00:00
|
|
|
return true;
|
2009-07-25 13:40:59 +00:00
|
|
|
}
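A hypothetical caller sketch (not part of interface.c) of the contract documented above: the caller keeps re-creating its rename button every redraw and leaves rename mode once UI_but_active_only() reports the button was removed. The function name and the assumption that 'rename_but' was created this redraw with a uiDefBut* call are illustrative only.
static bool keep_rename_button_alive(const bContext *C, ARegion *ar,
                                     uiBlock *block, uiBut *rename_but)
{
	/* returns false when the button was removed because it is no longer active */
	if (!UI_but_active_only(C, ar, block, rename_but)) {
		return false; /* caller should exit rename mode */
	}
	return true;
}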
|
|
|
|
|
2019-03-20 22:40:38 +11:00
|
|
|
bool UI_block_active_only_flagged_buttons(const bContext *C, ARegion *ar, uiBlock *block)
|
|
|
|
{
|
|
|
|
bool done = false;
|
|
|
|
for (uiBut *but = block->buttons.first; but; but = but->next) {
|
|
|
|
if (!done && ui_but_is_editable(but)) {
|
|
|
|
if (but->flag & UI_BUT_ACTIVATE_ON_INIT) {
|
|
|
|
if (UI_but_active_only(C, ar, block, but)) {
|
|
|
|
done = true;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
but->flag &= ~UI_BUT_ACTIVATE_ON_INIT;
|
|
|
|
}
|
|
|
|
return done;
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
|
|
|
2013-02-22 05:56:20 +00:00
|
|
|
/* simulate button click */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_execute(const bContext *C, uiBut *but)
|
2013-02-22 05:56:20 +00:00
|
|
|
{
|
2014-02-08 09:12:59 +11:00
|
|
|
ARegion *ar = CTX_wm_region(C);
|
|
|
|
void *active_back;
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_execute_begin((bContext *)C, ar, but, &active_back);
|
2014-02-08 09:12:59 +11:00
|
|
|
/* Value is applied in begin. No further action required. */
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_execute_end((bContext *)C, ar, but, active_back);
|
2013-02-22 05:56:20 +00:00
|
|
|
}
|
|
|
|
|
2012-03-18 07:38:51 +00:00
|
|
|
/* used to check if we need to disable undo, but doesn't make any changes
|
2014-01-04 17:16:19 +11:00
|
|
|
* returns false if undo needs to be disabled. */
|
2014-11-09 21:20:40 +01:00
|
|
|
static bool ui_but_is_rna_undo(const uiBut *but)
|
2011-08-18 19:07:37 +00:00
|
|
|
{
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->rnapoin.id.data) {
|
2011-08-18 19:07:37 +00:00
|
|
|
/* avoid undo push for buttons whose ID is screen or wm level
|
|
|
|
* we could disable undo for buttons with no ID too but may have
|
2011-10-15 14:14:22 +00:00
|
|
|
* unforeseen consequences, so best check for ID's we _know_ are not
|
2011-08-18 19:07:37 +00:00
|
|
|
* handled by undo - campbell */
|
2012-03-30 01:51:25 +00:00
|
|
|
ID *id = but->rnapoin.id.data;
|
2014-01-04 17:16:19 +11:00
|
|
|
if (ID_CHECK_UNDO(id) == false) {
|
|
|
|
return false;
|
2011-08-18 19:07:37 +00:00
|
|
|
}
|
|
|
|
else {
|
2014-01-04 17:16:19 +11:00
|
|
|
return true;
|
2011-08-18 19:07:37 +00:00
|
|
|
}
|
|
|
|
}
|
2011-11-08 15:11:27 +00:00
|
|
|
else if (but->rnapoin.type && !RNA_struct_undo_check(but->rnapoin.type)) {
|
2014-01-04 17:16:19 +11:00
|
|
|
return false;
|
2011-11-08 15:11:27 +00:00
|
|
|
}
|
2011-08-18 19:07:37 +00:00
|
|
|
|
2014-01-04 17:16:19 +11:00
|
|
|
return true;
|
2011-08-18 19:07:37 +00:00
|
|
|
}
|
|
|
|
|
2010-12-14 02:38:29 +00:00
|
|
|
/* assigns automatic keybindings to menu items for fast access
|
|
|
|
* (underline key in menu) */
|
|
|
|
static void ui_menu_block_set_keyaccels(uiBlock *block)
|
|
|
|
{
|
|
|
|
uiBut *but;
|
|
|
|
|
2019-01-04 11:05:53 +11:00
|
|
|
uint menu_key_mask = 0;
|
|
|
|
uchar menu_key;
|
2010-12-14 02:38:29 +00:00
|
|
|
const char *str_pt;
|
|
|
|
int pass;
|
2012-03-30 01:51:25 +00:00
|
|
|
int tot_missing = 0;
|
2010-12-14 02:38:29 +00:00
|
|
|
|
|
|
|
/* only do it before bounding */
|
2012-08-18 16:53:46 +00:00
|
|
|
if (block->rect.xmin != block->rect.xmax)
|
2010-12-14 02:38:29 +00:00
|
|
|
return;
|
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
for (pass = 0; pass < 2; pass++) {
|
2010-12-14 02:38:29 +00:00
|
|
|
/* 2 Passes, one for first letter only, second for any letter if first fails
|
|
|
|
* run first pass on all buttons so first word chars always get first priority */
|
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
2014-11-11 16:52:03 +01:00
|
|
|
if (!ELEM(but->type,
|
|
|
|
UI_BTYPE_BUT,
|
|
|
|
UI_BTYPE_BUT_MENU,
|
|
|
|
UI_BTYPE_MENU, UI_BTYPE_BLOCK,
|
|
|
|
UI_BTYPE_PULLDOWN) ||
|
|
|
|
(but->flag & UI_HIDDEN))
|
|
|
|
{
|
2010-12-14 02:38:29 +00:00
|
|
|
/* pass */
|
|
|
|
}
|
2012-03-30 01:51:25 +00:00
|
|
|
else if (but->menu_key == '\0') {
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->str) {
|
2012-03-30 01:51:25 +00:00
|
|
|
for (str_pt = but->str; *str_pt; ) {
|
|
|
|
menu_key = tolower(*str_pt);
|
|
|
|
if ((menu_key >= 'a' && menu_key <= 'z') && !(menu_key_mask & 1 << (menu_key - 'a'))) {
|
|
|
|
menu_key_mask |= 1 << (menu_key - 'a');
|
2010-12-14 02:38:29 +00:00
|
|
|
break;
|
|
|
|
}
|
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
if (pass == 0) {
|
2012-03-08 04:12:11 +00:00
|
|
|
/* Skip to next delimiter on first pass (be picky) */
|
2012-03-24 06:38:07 +00:00
|
|
|
while (isalpha(*str_pt))
|
2010-12-14 02:38:29 +00:00
|
|
|
str_pt++;
|
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (*str_pt)
|
2010-12-14 02:38:29 +00:00
|
|
|
str_pt++;
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
/* just step over every char on the second pass and find the first usable key */
|
|
|
|
str_pt++;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (*str_pt) {
|
2012-03-30 01:51:25 +00:00
|
|
|
but->menu_key = menu_key;
|
2010-12-14 02:38:29 +00:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
/* run second pass */
|
|
|
|
tot_missing++;
|
|
|
|
}
|
|
|
|
|
|
|
|
/* if all keys have been used just exit, unlikely */
|
2012-03-30 01:51:25 +00:00
|
|
|
if (menu_key_mask == (1 << 26) - 1) {
|
2010-12-14 02:38:29 +00:00
|
|
|
return;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
/* check if second pass is needed */
|
2012-03-24 06:38:07 +00:00
|
|
|
if (!tot_missing) {
|
2010-12-14 02:38:29 +00:00
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2012-01-21 22:42:09 +00:00
|
|
|
/* XXX, this code will shorten any allocated string to 'UI_MAX_NAME_STR'
|
|
|
|
* since this is really long it's unlikely to be an issue,
|
|
|
|
* but this could be supported */
|
2013-04-04 15:16:29 +00:00
|
|
|
void ui_but_add_shortcut(uiBut *but, const char *shortcut_str, const bool do_strip)
|
2012-01-21 22:42:09 +00:00
|
|
|
{
|
2014-05-01 12:40:49 +10:00
|
|
|
if (do_strip && (but->flag & UI_BUT_HAS_SEP_CHAR)) {
|
|
|
|
char *cpoin = strrchr(but->str, UI_SEP_CHAR);
|
2012-03-24 06:38:07 +00:00
|
|
|
if (cpoin) {
|
2012-03-30 01:51:25 +00:00
|
|
|
*cpoin = '\0';
|
2012-01-21 22:42:09 +00:00
|
|
|
}
|
2014-05-01 12:40:49 +10:00
|
|
|
but->flag &= ~UI_BUT_HAS_SEP_CHAR;
|
2012-01-21 22:42:09 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
/* without this, just allow stripping of the shortcut */
|
|
|
|
if (shortcut_str) {
|
|
|
|
char *butstr_orig;
|
|
|
|
|
|
|
|
if (but->str != but->strdata) {
|
|
|
|
butstr_orig = but->str; /* free after using as source buffer */
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
butstr_orig = BLI_strdup(but->str);
|
|
|
|
}
|
2018-07-01 19:57:31 +02:00
|
|
|
BLI_snprintf(
|
|
|
|
but->strdata,
|
|
|
|
sizeof(but->strdata),
|
|
|
|
"%s" UI_SEP_CHAR_S "%s",
|
|
|
|
butstr_orig, shortcut_str);
|
2012-01-21 22:42:09 +00:00
|
|
|
MEM_freeN(butstr_orig);
|
|
|
|
but->str = but->strdata;
|
2014-05-01 12:40:49 +10:00
|
|
|
but->flag |= UI_BUT_HAS_SEP_CHAR;
|
2018-07-03 13:04:21 +02:00
|
|
|
but->drawflag |= UI_BUT_HAS_SHORTCUT;
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_update(but);
|
2012-01-21 22:42:09 +00:00
|
|
|
}
|
|
|
|
}
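A hypothetical illustration (not part of interface.c) of what the function above leaves behind, assuming a button labelled "Copy" and the shortcut string "Ctrl C":
/* ui_but_add_shortcut(but, "Ctrl C", true);
 * afterwards: but->str == but->strdata == "Copy" UI_SEP_CHAR_S "Ctrl C",
 * but->flag has UI_BUT_HAS_SEP_CHAR and but->drawflag has UI_BUT_HAS_SHORTCUT,
 * so the drawing code can split the label and the shortcut apart again. */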
|
2010-12-14 02:38:29 +00:00
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
/* -------------------------------------------------------------------- */
|
|
|
|
/** \name Find Key Shortcut for Button
|
|
|
|
*
|
|
|
|
* - #ui_but_event_operator_string (and helpers)
|
|
|
|
* - #ui_but_event_property_operator_string
|
|
|
|
* \{ */
|
|
|
|
|
|
|
|
static bool ui_but_event_operator_string_from_operator(
|
2017-11-06 00:04:46 +11:00
|
|
|
const bContext *C, uiBut *but,
|
|
|
|
char *buf, const size_t buf_len)
|
2009-01-15 04:13:38 +00:00
|
|
|
{
|
2018-07-13 10:26:24 +02:00
|
|
|
BLI_assert(but->optype != NULL);
|
2013-06-01 05:53:44 +00:00
|
|
|
bool found = false;
|
2018-07-13 10:26:24 +02:00
|
|
|
IDProperty *prop = (but->opptr) ? but->opptr->data : NULL;
|
2011-11-14 14:42:47 +00:00
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
if (WM_key_event_operator_string(
|
|
|
|
C, but->optype->idname, but->opcontext, prop, true,
|
|
|
|
buf, buf_len))
|
|
|
|
{
|
|
|
|
found = true;
|
2013-06-01 05:53:44 +00:00
|
|
|
}
|
2018-07-13 10:26:24 +02:00
|
|
|
return found;
|
|
|
|
}
|
2011-11-14 14:42:47 +00:00
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
static bool ui_but_event_operator_string_from_menu(
|
|
|
|
const bContext *C, uiBut *but,
|
|
|
|
char *buf, const size_t buf_len)
|
|
|
|
{
|
|
|
|
MenuType *mt = UI_but_menutype_get(but);
|
|
|
|
BLI_assert(mt != NULL);
|
2011-11-14 14:42:47 +00:00
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
bool found = false;
|
2018-07-13 10:51:49 +02:00
|
|
|
IDProperty *prop_menu;
|
2013-06-01 05:53:44 +00:00
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
/* annoying, create a property */
|
|
|
|
IDPropertyTemplate val = {0};
|
|
|
|
prop_menu = IDP_New(IDP_GROUP, &val, __func__); /* dummy, name is unimportant */
|
2018-07-13 10:51:49 +02:00
|
|
|
IDP_AddToGroup(prop_menu, IDP_NewString(mt->idname, "name", sizeof(mt->idname)));
|
2018-07-13 10:26:24 +02:00
|
|
|
|
|
|
|
if (WM_key_event_operator_string(
|
2018-11-21 06:21:58 +11:00
|
|
|
C, "WM_OT_call_menu", WM_OP_INVOKE_REGION_WIN, prop_menu, true,
|
|
|
|
buf, buf_len))
|
2018-07-13 10:26:24 +02:00
|
|
|
{
|
|
|
|
found = true;
|
|
|
|
}
|
|
|
|
|
|
|
|
IDP_FreeProperty(prop_menu);
|
|
|
|
MEM_freeN(prop_menu);
|
|
|
|
return found;
|
|
|
|
}
|
|
|
|
|
2018-07-13 10:57:25 +02:00
|
|
|
static bool ui_but_event_operator_string_from_panel(
|
|
|
|
const bContext *C, uiBut *but,
|
|
|
|
char *buf, const size_t buf_len)
|
|
|
|
{
|
|
|
|
/** Nearly exact copy of #ui_but_event_operator_string_from_menu */
|
|
|
|
PanelType *pt = UI_but_paneltype_get(but);
|
|
|
|
BLI_assert(pt != NULL);
|
|
|
|
|
|
|
|
bool found = false;
|
|
|
|
IDProperty *prop_panel;
|
|
|
|
|
|
|
|
/* annoying, create a property */
|
|
|
|
IDPropertyTemplate val = {0};
|
|
|
|
prop_panel = IDP_New(IDP_GROUP, &val, __func__); /* dummy, name is unimportant */
|
|
|
|
IDP_AddToGroup(prop_panel, IDP_NewString(pt->idname, "name", sizeof(pt->idname)));
|
|
|
|
IDP_AddToGroup(prop_panel, IDP_New(IDP_INT, &(IDPropertyTemplate){ .i = pt->space_type, }, "space_type"));
|
|
|
|
IDP_AddToGroup(prop_panel, IDP_New(IDP_INT, &(IDPropertyTemplate){ .i = pt->region_type, }, "region_type"));
|
|
|
|
|
|
|
|
for (int i = 0; i < 2; i++) {
|
|
|
|
/* FIXME(campbell): We can't reasonably search all configurations - long term. */
|
|
|
|
IDP_ReplaceInGroup(prop_panel, IDP_New(IDP_INT, &(IDPropertyTemplate){ .i = i, }, "keep_open"));
|
|
|
|
if (WM_key_event_operator_string(
|
|
|
|
C, "WM_OT_call_panel", WM_OP_INVOKE_REGION_WIN, prop_panel, true,
|
|
|
|
buf, buf_len))
|
|
|
|
{
|
|
|
|
found = true;
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
IDP_FreeProperty(prop_panel);
|
|
|
|
MEM_freeN(prop_panel);
|
|
|
|
return found;
|
|
|
|
}
|
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
static bool ui_but_event_operator_string(
|
|
|
|
const bContext *C, uiBut *but,
|
|
|
|
char *buf, const size_t buf_len)
|
|
|
|
{
|
|
|
|
bool found = false;
|
|
|
|
|
|
|
|
if (but->optype != NULL) {
|
|
|
|
found = ui_but_event_operator_string_from_operator(C, but, buf, buf_len);
|
|
|
|
}
|
|
|
|
else if (UI_but_menutype_get(but) != NULL) {
|
|
|
|
found = ui_but_event_operator_string_from_menu(C, but, buf, buf_len);
|
2011-11-14 14:42:47 +00:00
|
|
|
}
|
2018-07-13 10:57:25 +02:00
|
|
|
else if (UI_but_paneltype_get(but) != NULL) {
|
|
|
|
found = ui_but_event_operator_string_from_panel(C, but, buf, buf_len);
|
|
|
|
}
|
2011-11-14 14:42:47 +00:00
|
|
|
|
2013-06-01 05:53:44 +00:00
|
|
|
return found;
|
|
|
|
}
|
2011-11-14 14:42:47 +00:00
|
|
|
|
2017-11-06 00:04:46 +11:00
|
|
|
static bool ui_but_event_property_operator_string(
|
|
|
|
const bContext *C, uiBut *but,
|
|
|
|
char *buf, const size_t buf_len)
|
|
|
|
{
|
2013-07-07 12:53:15 +00:00
|
|
|
/* context toggle operator names to check... */
|
2018-11-30 08:01:41 +11:00
|
|
|
|
|
|
|
/* This function could use a refactor to generalize button type to operator relationship
|
|
|
|
* as well as which operators use properties.
|
|
|
|
* - Campbell
|
|
|
|
* */
|
2013-07-07 12:53:15 +00:00
|
|
|
const char *ctx_toggle_opnames[] = {
|
|
|
|
"WM_OT_context_toggle",
|
|
|
|
"WM_OT_context_toggle_enum",
|
|
|
|
"WM_OT_context_cycle_int",
|
|
|
|
"WM_OT_context_cycle_enum",
|
|
|
|
"WM_OT_context_cycle_array",
|
|
|
|
"WM_OT_context_menu_enum",
|
2019-02-03 14:01:45 +11:00
|
|
|
NULL,
|
2018-05-23 10:47:12 +02:00
|
|
|
};
|
2018-11-29 17:32:35 +11:00
|
|
|
|
|
|
|
const char *ctx_enum_opnames[] = {
|
|
|
|
"WM_OT_context_set_enum",
|
2019-02-03 14:01:45 +11:00
|
|
|
NULL,
|
2018-11-29 17:32:35 +11:00
|
|
|
};
|
|
|
|
|
2018-11-29 18:45:37 +11:00
|
|
|
const char *ctx_enum_opnames_for_Area_ui_type[] = {
|
|
|
|
"SCREEN_OT_space_type_set_or_cycle",
|
2019-02-03 14:01:45 +11:00
|
|
|
NULL,
|
2018-11-29 18:45:37 +11:00
|
|
|
};
|
|
|
|
|
|
|
|
const char **opnames = ctx_toggle_opnames;
|
|
|
|
int opnames_len = ARRAY_SIZE(ctx_toggle_opnames);
|
|
|
|
|
|
|
|
|
2018-11-29 17:32:35 +11:00
|
|
|
int prop_enum_value = -1;
|
|
|
|
bool prop_enum_value_ok = false;
|
2018-11-29 18:45:37 +11:00
|
|
|
bool prop_enum_value_is_int = false;
|
|
|
|
const char *prop_enum_value_id = "value";
|
2018-11-29 17:32:35 +11:00
|
|
|
PointerRNA *ptr = &but->rnapoin;
|
|
|
|
PropertyRNA *prop = but->rnaprop;
|
|
|
|
if ((but->type == UI_BTYPE_BUT_MENU) && (but->block->handle != NULL)) {
|
|
|
|
uiBut *but_parent = but->block->handle->popup_create_vars.but;
|
|
|
|
if ((but->type == UI_BTYPE_BUT_MENU) &&
|
2018-11-30 08:01:41 +11:00
|
|
|
(but_parent && but_parent->rnaprop) &&
|
2018-11-29 18:45:37 +11:00
|
|
|
(RNA_property_type(but_parent->rnaprop) == PROP_ENUM) &&
|
2018-11-30 08:01:41 +11:00
|
|
|
(but_parent->menu_create_func == ui_def_but_rna__menu))
|
2018-11-29 17:32:35 +11:00
|
|
|
{
|
|
|
|
prop_enum_value = (int)but->hardmin;
|
2018-11-29 18:45:37 +11:00
|
|
|
ptr = &but_parent->rnapoin;
|
|
|
|
prop = but_parent->rnaprop;
|
2018-11-29 17:32:35 +11:00
|
|
|
prop_enum_value_ok = true;
|
2018-11-29 18:45:37 +11:00
|
|
|
|
|
|
|
opnames = ctx_enum_opnames;
|
|
|
|
opnames_len = ARRAY_SIZE(ctx_enum_opnames);
|
2018-11-29 17:32:35 +11:00
|
|
|
}
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
bool found = false;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2018-11-29 17:32:35 +11:00
|
|
|
/* Don't use the button again. */
|
|
|
|
but = NULL;
|
|
|
|
|
2019-01-15 23:24:20 +11:00
|
|
|
/* this version is only for finding hotkeys for properties
|
|
|
|
* (which get set via context using operators) */
|
2018-11-29 17:32:35 +11:00
|
|
|
if (prop) {
|
2018-05-23 10:47:12 +02:00
|
|
|
/* to avoid massive slowdowns on property panels, for now, we only check the
|
2013-07-07 12:53:15 +00:00
|
|
|
* hotkeys for Editor / Scene settings...
|
|
|
|
*
|
|
|
|
* TODO: userpref settings?
|
|
|
|
*/
|
|
|
|
char *data_path = NULL;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2018-11-29 17:32:35 +11:00
|
|
|
if (ptr->id.data) {
|
|
|
|
ID *id = ptr->id.data;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
if (GS(id->name) == ID_SCR) {
|
2018-05-23 10:47:12 +02:00
|
|
|
/* screen/editor property
|
|
|
|
* NOTE: in most cases, there is actually no info for backwards tracing
|
2013-07-07 12:53:15 +00:00
|
|
|
* how to get back to ID from the editor data we may be dealing with
|
|
|
|
*/
|
2018-11-29 17:32:35 +11:00
|
|
|
if (RNA_struct_is_a(ptr->type, &RNA_Space)) {
|
|
|
|
/* data should be directly on here... */
|
|
|
|
data_path = BLI_sprintfN("space_data.%s", RNA_property_identifier(prop));
|
|
|
|
}
|
|
|
|
else if (RNA_struct_is_a(ptr->type, &RNA_Area)) {
|
2013-07-07 12:53:15 +00:00
|
|
|
/* data should be directly on here... */
|
2018-11-29 17:32:35 +11:00
|
|
|
const char *prop_id = RNA_property_identifier(prop);
|
2018-11-29 18:45:37 +11:00
|
|
|
/* Hack since keys access 'type', UI shows 'ui_type'. */
|
2018-11-29 17:32:35 +11:00
|
|
|
if (STREQ(prop_id, "ui_type")) {
|
|
|
|
prop_id = "type";
|
|
|
|
prop_enum_value >>= 16;
|
|
|
|
prop = RNA_struct_find_property(ptr, prop_id);
|
2018-11-29 18:45:37 +11:00
|
|
|
|
|
|
|
opnames = ctx_enum_opnames_for_Area_ui_type;
|
|
|
|
opnames_len = ARRAY_SIZE(ctx_enum_opnames_for_Area_ui_type);
|
|
|
|
prop_enum_value_id = "space_type";
|
|
|
|
prop_enum_value_is_int = true;
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
data_path = BLI_sprintfN("area.%s", prop_id);
|
2018-11-29 17:32:35 +11:00
|
|
|
}
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
/* special exceptions for common nested data in editors... */
|
2018-11-29 17:32:35 +11:00
|
|
|
if (RNA_struct_is_a(ptr->type, &RNA_DopeSheet)) {
|
2013-07-07 12:53:15 +00:00
|
|
|
/* dopesheet filtering options... */
|
2018-11-29 17:32:35 +11:00
|
|
|
data_path = BLI_sprintfN("space_data.dopesheet.%s", RNA_property_identifier(prop));
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
2018-11-29 17:32:35 +11:00
|
|
|
else if (RNA_struct_is_a(ptr->type, &RNA_FileSelectParams)) {
|
2015-12-03 12:50:09 +01:00
|
|
|
/* Filebrowser options... */
|
2018-11-29 17:32:35 +11:00
|
|
|
data_path = BLI_sprintfN("space_data.params.%s", RNA_property_identifier(prop));
|
2015-12-03 12:50:09 +01:00
|
|
|
}
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
else if (GS(id->name) == ID_SCE) {
|
2018-11-29 17:32:35 +11:00
|
|
|
if (RNA_struct_is_a(ptr->type, &RNA_ToolSettings)) {
|
2018-05-23 10:47:12 +02:00
|
|
|
/* toolsettings property
|
2013-07-07 12:53:15 +00:00
|
|
|
* NOTE: toolsettings is usually accessed directly (i.e. not through scene)
|
|
|
|
*/
|
2018-11-29 17:32:35 +11:00
|
|
|
data_path = RNA_path_from_ID_to_property(ptr, prop);
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
/* scene property */
|
2018-11-29 17:32:35 +11:00
|
|
|
char *path = RNA_path_from_ID_to_property(ptr, prop);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
if (path) {
|
|
|
|
data_path = BLI_sprintfN("scene.%s", path);
|
|
|
|
MEM_freeN(path);
|
|
|
|
}
|
2013-09-14 12:04:10 +00:00
|
|
|
#if 0
|
|
|
|
else {
|
|
|
|
printf("ERROR in %s(): Couldn't get path for scene property - %s\n",
|
2018-11-29 17:32:35 +11:00
|
|
|
__func__, RNA_property_identifier(prop));
|
2013-09-14 12:04:10 +00:00
|
|
|
}
|
|
|
|
#endif
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
//puts("other id");
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2018-11-29 17:32:35 +11:00
|
|
|
//printf("prop shortcut: '%s' (%s)\n", RNA_property_identifier(prop), data_path);
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
/* we have a datapath! */
|
2018-12-02 15:09:15 +11:00
|
|
|
if (data_path || (prop_enum_value_ok && prop_enum_value_id)) {
|
2013-07-07 12:53:15 +00:00
|
|
|
/* create a property to host the "datapath" property we're sending to the operators */
|
|
|
|
IDProperty *prop_path;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
IDPropertyTemplate val = {0};
|
|
|
|
prop_path = IDP_New(IDP_GROUP, &val, __func__);
|
2018-11-29 18:45:37 +11:00
|
|
|
if (data_path) {
|
|
|
|
IDP_AddToGroup(prop_path, IDP_NewString(data_path, "data_path", strlen(data_path) + 1));
|
|
|
|
}
|
|
|
|
if (prop_enum_value_ok) {
|
2018-11-29 17:32:35 +11:00
|
|
|
const EnumPropertyItem *item;
|
|
|
|
bool free;
|
|
|
|
RNA_property_enum_items((bContext *)C, ptr, prop, &item, NULL, &free);
|
|
|
|
int index = RNA_enum_from_value(item, prop_enum_value);
|
|
|
|
if (index != -1) {
|
2018-11-29 18:45:37 +11:00
|
|
|
IDProperty *prop_value;
|
|
|
|
if (prop_enum_value_is_int) {
|
|
|
|
int value = item[index].value;
|
|
|
|
prop_value = IDP_New(IDP_INT, &(IDPropertyTemplate){ .i = value, }, prop_enum_value_id);
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
const char *id = item[index].identifier;
|
|
|
|
prop_value = IDP_NewString(id, prop_enum_value_id, strlen(id) + 1);
|
|
|
|
}
|
|
|
|
IDP_AddToGroup(prop_path, prop_value);
|
2018-11-29 17:32:35 +11:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
opnames_len = 0; /* Do nothing. */
|
|
|
|
}
|
|
|
|
if (free) {
|
|
|
|
MEM_freeN((void *)item);
|
|
|
|
}
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
/* check each until one works... */
|
2018-11-29 17:32:35 +11:00
|
|
|
|
|
|
|
for (int i = 0; (i < opnames_len) && (opnames[i]); i++) {
|
2017-11-06 00:04:46 +11:00
|
|
|
if (WM_key_event_operator_string(
|
2018-11-29 17:32:35 +11:00
|
|
|
C, opnames[i], WM_OP_INVOKE_REGION_WIN, prop_path, false,
|
2018-11-21 06:21:58 +11:00
|
|
|
buf, buf_len))
|
2013-07-07 12:53:15 +00:00
|
|
|
{
|
|
|
|
found = true;
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
/* cleanup */
|
|
|
|
IDP_FreeProperty(prop_path);
|
|
|
|
MEM_freeN(prop_path);
|
2018-11-29 18:45:37 +11:00
|
|
|
if (data_path) {
|
|
|
|
MEM_freeN(data_path);
|
|
|
|
}
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2013-07-07 12:53:15 +00:00
|
|
|
return found;
|
|
|
|
}
|
|
|
|
|
2018-07-13 10:26:24 +02:00
|
|
|
/** \} */
|
|
|
|
|
2015-05-31 14:20:03 +10:00
|
|
|
/**
|
|
|
|
* This goes in a seemingly weird pattern:
|
Pie Menus C code backend.
This commit merges the code in the pie-menu branch.
As per decisions taken the last few days, there are no pie menus
included and there will be an official add-on including overrides of
some keys with pie menus. However, people will now be able to use the
new code in python.
Full Documentation is in http://wiki.blender.org/index.php/Dev:Ref/
Thanks:
Campbell Barton, Dalai Felinto and Ton Roosendaal for the code review
and design comments
Jonathan Williamson, Pawel Lyczkowski, Pablo Vazquez among others for
suggestions during the development.
Special Thanks to Sean Olson, for his support, suggestions, testing and
merciless bugging so that I would finish the pie menu code. Without him
we wouldn't be here. Also to the rest of the developers of the original
python add-on, Patrick Moore and Dan Eicher and finally to Matt Ebb, who
did the research and first implementation and whose code I used to get
started.
2014-08-11 10:39:59 +02:00
|
|
|
*
|
2015-05-31 14:20:03 +10:00
|
|
|
* <pre>
|
2014-08-11 10:39:59 +02:00
|
|
|
* 4
|
|
|
|
* 5 6
|
|
|
|
* 1 2
|
|
|
|
* 7 8
|
|
|
|
* 3
|
2015-05-31 14:20:03 +10:00
|
|
|
* </pre>
|
2014-08-11 10:39:59 +02:00
|
|
|
*
|
|
|
|
* but it's actually quite logical. It's designed to be 'upwards compatible'
|
|
|
|
* for muscle memory so that the menu item locations are fixed and don't move
|
|
|
|
* as new items are added to the menu later on. It also optimises efficiency -
|
|
|
|
* a radial menu is best kept symmetrical, with as large an angle between
|
|
|
|
* items as possible, so that the gestural mouse movements can be fast and inexact.
|
2016-07-15 02:36:21 +10:00
|
|
|
*
|
2014-08-11 10:39:59 +02:00
|
|
|
* It starts off with two opposite sides for the first two items
|
|
|
|
* then joined by the one below for the third (this way, even with three items,
|
|
|
|
* the menu seems to still be 'in order' reading left to right). Then the fourth is
|
|
|
|
* added to complete the compass directions. From here, it's just a matter of
|
|
|
|
* subdividing the rest of the angles for the last 4 items.
|
|
|
|
*
|
|
|
|
* --Matt 07/2006
|
|
|
|
*/
|
|
|
|
const char ui_radial_dir_order[8] = {
|
|
|
|
UI_RADIAL_W, UI_RADIAL_E, UI_RADIAL_S, UI_RADIAL_N,
|
|
|
|
UI_RADIAL_NW, UI_RADIAL_NE, UI_RADIAL_SW, UI_RADIAL_SE};
|
|
|
|
|
|
|
|
const char ui_radial_dir_to_numpad[8] = {8, 9, 6, 3, 2, 1, 4, 7};
|
2014-08-13 15:11:19 +02:00
|
|
|
const short ui_radial_dir_to_angle[8] = {90, 45, 0, 315, 270, 225, 180, 135};
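The three tables above encode the layout described in the comment: ui_radial_dir_order gives the compass direction assigned to the Nth pie item added, and the other two map a direction to the numpad key used as its shortcut and to the angle it is drawn at. The stand-alone sketch below is not part of the original source; the DEMO_* names and the assumption that directions are numbered 0..7 clockwise from north (matching the indexing of the tables above) are illustrative only.

#include <stdio.h>

/* Assumed numbering: 0..7 clockwise starting at north (illustrative only). */
enum {
  DEMO_RADIAL_N, DEMO_RADIAL_NE, DEMO_RADIAL_E, DEMO_RADIAL_SE,
  DEMO_RADIAL_S, DEMO_RADIAL_SW, DEMO_RADIAL_W, DEMO_RADIAL_NW,
};

static const char *demo_dir_name[8] = {"N", "NE", "E", "SE", "S", "SW", "W", "NW"};

/* Same data as the tables above, copied here so the sketch is self-contained. */
static const char demo_dir_order[8] = {
    DEMO_RADIAL_W, DEMO_RADIAL_E, DEMO_RADIAL_S, DEMO_RADIAL_N,
    DEMO_RADIAL_NW, DEMO_RADIAL_NE, DEMO_RADIAL_SW, DEMO_RADIAL_SE};
static const char demo_dir_to_numpad[8] = {8, 9, 6, 3, 2, 1, 4, 7};
static const short demo_dir_to_angle[8] = {90, 45, 0, 315, 270, 225, 180, 135};

int main(void)
{
  /* For each pie item in creation order, print where it ends up. */
  for (int i = 0; i < 8; i++) {
    const int dir = demo_dir_order[i];
    printf("item %d -> %-2s  numpad %d  angle %d\n",
           i, demo_dir_name[dir], demo_dir_to_numpad[dir], demo_dir_to_angle[dir]);
  }
  return 0;
}

Running it shows the first two items landing on opposite west/east sides, the third below, the fourth on top, and the remaining four filling the diagonals, exactly as the comment describes.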
|
2014-08-11 10:39:59 +02:00
|
|
|
|
|
|
|
static void ui_but_pie_direction_string(uiBut *but, char *buf, int size)
|
|
|
|
{
|
|
|
|
BLI_assert(but->pie_dir < ARRAY_SIZE(ui_radial_dir_to_numpad));
|
|
|
|
BLI_snprintf(buf, size, "%d", ui_radial_dir_to_numpad[but->pie_dir]);
|
|
|
|
}
|
|
|
|
|
2013-06-01 05:53:44 +00:00
|
|
|
static void ui_menu_block_set_keymaps(const bContext *C, uiBlock *block)
|
|
|
|
{
|
|
|
|
uiBut *but;
|
|
|
|
char buf[128];
|
|
|
|
|
2018-05-13 10:25:18 +02:00
|
|
|
BLI_assert(block->flag & (UI_BLOCK_LOOP | UI_BLOCK_SHOW_SHORTCUT_ALWAYS));
|
2017-07-27 11:49:41 +02:00
|
|
|
|
2013-06-01 05:53:44 +00:00
|
|
|
/* only do it before bounding */
|
2018-09-18 17:44:14 +02:00
|
|
|
if (block->rect.xmin != block->rect.xmax) {
|
|
|
|
return;
|
|
|
|
}
|
|
|
|
if (STREQ(block->name, "splash")) {
|
2013-06-01 05:53:44 +00:00
|
|
|
return;
|
2018-09-18 17:44:14 +02:00
|
|
|
}
|
2013-06-01 05:53:44 +00:00
|
|
|
|
2014-08-11 10:39:59 +02:00
|
|
|
if (block->flag & UI_BLOCK_RADIAL) {
|
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
|
|
|
if (but->pie_dir != UI_RADIAL_NONE) {
|
|
|
|
ui_but_pie_direction_string(but, buf, sizeof(buf));
|
|
|
|
ui_but_add_shortcut(but, buf, false);
|
|
|
|
}
|
2013-06-01 05:53:44 +00:00
|
|
|
}
|
2014-08-11 10:39:59 +02:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
2018-05-13 10:25:18 +02:00
|
|
|
if (block->flag & UI_BLOCK_SHOW_SHORTCUT_ALWAYS) {
|
|
|
|
/* Skip icon-only buttons (as used in the toolbar). */
|
|
|
|
if (but->drawstr[0] == '\0') {
|
|
|
|
continue;
|
|
|
|
}
|
2018-09-03 13:47:14 +10:00
|
|
|
else if (((block->flag & UI_BLOCK_POPOVER) == 0) && UI_but_is_tool(but)) {
|
2019-01-15 23:24:20 +11:00
|
|
|
/* For non-popovers, shown in shortcut only
|
|
|
|
* (has special shortcut handling code). */
|
2018-09-03 13:47:14 +10:00
|
|
|
continue;
|
|
|
|
}
|
2018-05-13 10:25:18 +02:00
|
|
|
}
|
|
|
|
else if (but->dt != UI_EMBOSS_PULLDOWN) {
|
2017-07-27 11:49:41 +02:00
|
|
|
continue;
|
|
|
|
}
|
2014-08-11 10:39:59 +02:00
|
|
|
|
|
|
|
if (ui_but_event_operator_string(C, but, buf, sizeof(buf))) {
|
|
|
|
ui_but_add_shortcut(but, buf, false);
|
|
|
|
}
|
|
|
|
else if (ui_but_event_property_operator_string(C, but, buf, sizeof(buf))) {
|
|
|
|
ui_but_add_shortcut(but, buf, false);
|
|
|
|
}
|
2013-07-07 12:53:15 +00:00
|
|
|
}
|
2013-06-01 05:53:44 +00:00
|
|
|
}
|
2009-01-15 04:13:38 +00:00
|
|
|
}
|
|
|
|
|
2017-11-29 15:47:37 +01:00
|
|
|
void ui_but_override_flag(uiBut *but)
|
|
|
|
{
|
2018-05-09 16:17:15 +02:00
|
|
|
const int override_status = RNA_property_static_override_status(&but->rnapoin, but->rnaprop, but->rnaindex);
|
2017-11-29 15:47:37 +01:00
|
|
|
|
2018-03-14 11:47:35 +01:00
|
|
|
if (override_status & RNA_OVERRIDE_STATUS_OVERRIDDEN) {
|
2017-11-29 15:47:37 +01:00
|
|
|
but->flag |= UI_BUT_OVERRIDEN;
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
but->flag &= ~UI_BUT_OVERRIDEN;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_update_from_old(const bContext *C, uiBlock *block)
|
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2014-06-15 01:40:15 +10:00
|
|
|
uiBut *but_old;
|
2008-11-11 18:31:32 +00:00
|
|
|
uiBut *but;
|
|
|
|
|
2014-06-15 01:40:15 +10:00
|
|
|
if (!block->oldblock)
|
|
|
|
return;
|
|
|
|
|
|
|
|
but_old = block->oldblock->buttons.first;
|
2014-02-08 09:03:25 +11:00
|
|
|
|
2014-06-15 01:40:15 +10:00
|
|
|
if (BLI_listbase_is_empty(&block->oldblock->butstore) == false) {
|
2014-02-08 09:03:25 +11:00
|
|
|
UI_butstore_update(block);
|
|
|
|
}
|
|
|
|
|
2014-06-15 01:40:15 +10:00
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
|
|
|
if (ui_but_update_from_old_block(C, block, &but, &but_old)) {
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_update(but);
|
2016-02-02 14:13:57 +11:00
|
|
|
|
|
|
|
/* redraw dynamic tooltip if we have one open */
|
|
|
|
if (but->tip_func) {
|
|
|
|
UI_but_tooltip_refresh((bContext *)C, but);
|
|
|
|
}
|
2014-06-15 01:40:15 +10:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
block->auto_open = block->oldblock->auto_open;
|
|
|
|
block->auto_open_last = block->oldblock->auto_open_last;
|
|
|
|
block->tooltipdisabled = block->oldblock->tooltipdisabled;
|
2014-11-06 20:19:21 +01:00
|
|
|
BLI_movelisttolist(&block->color_pickers.list, &block->oldblock->color_pickers.list);
|
2014-06-15 01:40:15 +10:00
|
|
|
|
|
|
|
block->oldblock = NULL;
|
|
|
|
}
|
|
|
|
|
2017-10-12 14:43:45 +02:00
|
|
|
void UI_block_end_ex(const bContext *C, uiBlock *block, const int xy[2], int r_xy[2])
|
2014-06-15 01:40:15 +10:00
|
|
|
{
|
|
|
|
wmWindow *window = CTX_wm_window(C);
|
|
|
|
Scene *scene = CTX_data_scene(C);
|
2018-06-11 14:33:53 +02:00
|
|
|
ARegion *region = CTX_wm_region(C);
|
2014-06-15 01:40:15 +10:00
|
|
|
uiBut *but;
|
|
|
|
|
|
|
|
BLI_assert(block->active);
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_update_from_old(C, block);
|
2014-06-15 01:40:15 +10:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* inherit flags from 'old' buttons that were drawn here previously, based
|
|
|
|
* on matching buttons; we need this to make button event handling
|
2011-10-15 14:14:22 +00:00
|
|
|
* non-blocking, while still allowing buttons to be remade each redraw as
|
2008-11-11 18:31:32 +00:00
|
|
|
* is expected by blender code */
|
2012-03-30 01:51:25 +00:00
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
2012-07-03 19:09:07 +00:00
|
|
|
/* temp? Proper check for graying out */
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->optype) {
|
2012-03-30 01:51:25 +00:00
|
|
|
wmOperatorType *ot = but->optype;
|
2009-06-03 00:09:30 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->context)
|
2012-03-30 01:51:25 +00:00
|
|
|
CTX_store_set((bContext *)C, but->context);
|
2009-06-03 00:09:30 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
if (ot == NULL || WM_operator_poll_context((bContext *)C, ot, but->opcontext) == 0) {
|
2009-02-18 18:08:33 +00:00
|
|
|
but->flag |= UI_BUT_DISABLED;
|
|
|
|
}
|
2009-06-03 00:09:30 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->context)
|
2012-03-30 01:51:25 +00:00
|
|
|
CTX_store_set((bContext *)C, NULL);
|
2009-02-18 18:08:33 +00:00
|
|
|
}
|
UI:
* Added support for soft/hard range in the buttons code. Currently
it works by only allowing dragging or click-incrementing within the soft
range, but typing a number value allows going outside it.
If the number is outside the soft range, the range will be extended,
rounded to values like:
.., 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0, ..
(a small stand-alone sketch of this rounding rule follows below)
2009-03-29 18:44:49 +00:00
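A minimal stand-alone sketch of the rounding rule described above (hypothetical helper name, not the actual Blender function; assumes a positive value):

#include <math.h>
#include <stdio.h>

/* Smallest value of the form 1, 2 or 5 times a power of ten that is >= value. */
static double demo_soft_range_round_up(double value)
{
  const double e = pow(10.0, floor(log10(value)));
  if (value <= 1.0 * e) return 1.0 * e;
  if (value <= 2.0 * e) return 2.0 * e;
  if (value <= 5.0 * e) return 5.0 * e;
  return 10.0 * e;
}

int main(void)
{
  const double typed[] = {0.07, 0.3, 1.5, 4.0, 12.0, 60.0};
  for (int i = 0; i < 6; i++) {
    /* e.g. typing 12.0 into a button whose soft maximum is 10.0 would
     * extend the soft maximum to 20.0 */
    printf("%g -> %g\n", typed[i], demo_soft_range_round_up(typed[i]));
  }
  return 0;
}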
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
ui_but_anim_flag(but, (scene) ? scene->r.cfra : 0.0f);
|
2017-11-29 15:47:37 +01:00
|
|
|
ui_but_override_flag(but);
|
2018-06-16 18:26:34 +02:00
|
|
|
if (UI_but_is_decorator(but)) {
|
|
|
|
ui_but_anim_decorate_update_from_flag(but);
|
|
|
|
}
|
2009-02-18 18:08:33 +00:00
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
|
2009-01-15 04:13:38 +00:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* handle pending stuff */
|
2012-09-10 07:03:30 +00:00
|
|
|
if (block->layouts.first) {
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_layout_resolve(block, NULL, NULL);
|
2012-09-10 07:03:30 +00:00
|
|
|
}
|
UI: New Global Top-Bar (WIP)
== Main Features/Changes for Users
* Add a horizontal bar at the top of all non-temp windows, consisting of two horizontal sub-bars.
* Upper sub-bar contains global menus (File, Render, etc.), tabs for workspaces and scene selector.
* Lower sub-bar contains object mode selector, screen-layout and render-layer selector. Later operator and/or tool settings will be placed here.
* Individual sections of the topbar are individually scrollable.
* Workspace tabs can be double- or ctrl-clicked for renaming and contain 'x' icon for deleting.
* Top-bar should scale nicely with DPI.
* The lower half of the top-bar can be hidden by dragging the lower top-bar edge up. Better hiding options are planned (e.g. hide in fullscreen modes).
* Info editors at the top of the window that use the full window width will be replaced by the top-bar.
* In fullscreen modes, no more info editor is added on top, the top-bar replaces it.
== Technical Features/Changes
* Adds initial support for global areas
A global area is part of the window, not part of the regular screen-layout.
I've added a macro iterator to iterate over both global and screen-layout level areas. When iterating over areas, from now on developers should always consider whether they have to include global areas.
* Adds a TOPBAR editor type
The editor type is hidden in the UI editor type menu.
* Adds a variation of the ID template to display IDs as tab buttons (template_ID_tabs in BPY)
* Does various changes to RNA button creation code to improve their appearance in the horizontal top-bar.
* Adds support for dynamically sized regions. That is, regions that scale automatically to the layout bounds.
The code for this is currently a big hack (it's based on drawing the UI multiple times). This should definitely be improved.
* Adds a template for displaying operator properties optimized for the top-bar. This will probably change a lot still and is in fact disabled in code.
Since the final top-bar design depends a lot on other 2.8 designs (mainly tool-system and workspaces), we decided to not show the operator or tool settings in the top-bar for now. That means most of the lower sub-bar is empty for the time being.
NOTE: Top-bar or global area data is not written to files or SDNA. They are simply added to the window when opening Blender or reading a file. This allows us doing changes to the top-bar without having to care for compatibility.
== ToDo's
It's a bit hard to predict all the ToDo's; here are the known main ones:
* Add options for the new active-tool system and for operator redo to the topbar.
* Automatically hide the top-bar in fullscreen modes.
* General visual polish.
* Top-bar drag & drop support (WIP in temp-tab_drag_drop).
* Improve dynamic regions (should also fix some layout glitches).
* Make internal terminology consistent.
* Enable topbar file writing once design is more advanced.
* Address TODO's and XXX's in code :)
Thanks @brecht for the review! And @sergey for the complaining ;)
Differential Revision: D2758
2018-04-20 17:14:03 +02:00
|
|
|
ui_block_align_calc(block, CTX_wm_region(C));
|
2012-03-24 06:38:07 +00:00
|
|
|
if ((block->flag & UI_BLOCK_LOOP) && (block->flag & UI_BLOCK_NUMSELECT)) {
|
2011-12-22 00:03:20 +00:00
|
|
|
ui_menu_block_set_keyaccels(block); /* could use a different flag to check */
|
|
|
|
}
|
2012-09-10 07:03:30 +00:00
|
|
|
|
2018-05-13 10:25:18 +02:00
|
|
|
if (block->flag & (UI_BLOCK_LOOP | UI_BLOCK_SHOW_SHORTCUT_ALWAYS)) {
|
2012-09-10 07:03:30 +00:00
|
|
|
ui_menu_block_set_keymaps(C, block);
|
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and improve bounds computation for popup
menus. Previously it tried to find out in advance what the size
of the menu would be, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds (a minimal sketch of
this clamping follows below). Will convert some other functions to
use this too.
2009-01-30 12:18:08 +00:00
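Not the original uiPopupBoundsBlock, but a minimal sketch of the idea in the last point: lay the popup out first, then translate its rectangle so it stays inside the window (the DemoRect type and function name are illustrative only):

#include <stdio.h>

typedef struct DemoRect {
  float xmin, xmax, ymin, ymax;
} DemoRect;

/* Shift an already-laid-out popup rect so it fits in a win_x by win_y window. */
static void demo_popup_clamp_to_window(DemoRect *rect, float win_x, float win_y)
{
  float ofs_x = 0.0f, ofs_y = 0.0f;

  if (rect->xmax > win_x) {
    ofs_x = win_x - rect->xmax;  /* shift left so the right edge fits */
  }
  if (rect->xmin + ofs_x < 0.0f) {
    ofs_x = -rect->xmin;         /* but never push the left edge off-screen */
  }
  if (rect->ymax > win_y) {
    ofs_y = win_y - rect->ymax;
  }
  if (rect->ymin + ofs_y < 0.0f) {
    ofs_y = -rect->ymin;
  }

  rect->xmin += ofs_x; rect->xmax += ofs_x;
  rect->ymin += ofs_y; rect->ymax += ofs_y;
}

int main(void)
{
  DemoRect rect = {700.0f, 900.0f, -50.0f, 150.0f};
  demo_popup_clamp_to_window(&rect, 800.0f, 600.0f);
  printf("popup moved to (%g..%g, %g..%g)\n", rect.xmin, rect.xmax, rect.ymin, rect.ymax);
  return 0;
}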
|
|
|
/* after keymaps! */
|
2012-09-10 07:03:30 +00:00
|
|
|
switch (block->bounds_type) {
|
|
|
|
case UI_BLOCK_BOUNDS_NONE:
|
|
|
|
break;
|
|
|
|
case UI_BLOCK_BOUNDS:
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc(block);
|
2012-09-10 07:03:30 +00:00
|
|
|
break;
|
|
|
|
case UI_BLOCK_BOUNDS_TEXT:
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc_text(block, 0.0f);
|
2012-09-10 07:03:30 +00:00
|
|
|
break;
|
|
|
|
case UI_BLOCK_BOUNDS_POPUP_CENTER:
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc_centered(window, block);
|
2012-09-10 07:03:30 +00:00
|
|
|
break;
|
2014-08-11 10:39:59 +02:00
|
|
|
case UI_BLOCK_BOUNDS_PIE_CENTER:
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_block_bounds_calc_centered_pie(block);
|
2014-08-11 10:39:59 +02:00
|
|
|
break;
|
2009-01-30 12:18:08 +00:00
|
|
|
|
2012-09-10 07:03:30 +00:00
|
|
|
/* fallback */
|
|
|
|
case UI_BLOCK_BOUNDS_POPUP_MOUSE:
|
|
|
|
case UI_BLOCK_BOUNDS_POPUP_MENU:
|
2017-10-12 14:43:45 +02:00
|
|
|
ui_block_bounds_calc_popup(window, block, block->bounds_type, xy, r_xy);
|
2012-09-10 07:03:30 +00:00
|
|
|
break;
|
|
|
|
}
|
|
|
|
|
|
|
|
if (block->rect.xmin == 0.0f && block->rect.xmax == 0.0f) {
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_bounds_set_normal(block, 0);
|
2012-09-10 07:03:30 +00:00
|
|
|
}
|
|
|
|
if (block->flag & UI_BUT_ALIGN) {
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_align_end(block);
|
2012-09-10 07:03:30 +00:00
|
|
|
}
|
2009-01-28 23:29:27 +00:00
|
|
|
|
2018-06-11 14:33:53 +02:00
|
|
|
ui_update_flexible_spacing(region, block);
|
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
block->endblock = 1;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_end(const bContext *C, uiBlock *block)
|
2014-06-15 01:40:15 +10:00
|
|
|
{
|
|
|
|
wmWindow *window = CTX_wm_window(C);
|
|
|
|
|
2017-10-12 14:43:45 +02:00
|
|
|
UI_block_end_ex(C, block, &window->eventstate->x, NULL);
|
2014-06-15 01:40:15 +10:00
|
|
|
}
|
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* ************** BLOCK DRAWING FUNCTION ************* */
|
|
|
|
|
2009-04-15 14:43:54 +00:00
|
|
|
void ui_fontscale(short *points, float aspect)
|
2009-04-10 14:06:24 +00:00
|
|
|
{
|
2012-03-24 06:38:07 +00:00
|
|
|
if (aspect < 0.9f || aspect > 1.1f) {
|
2012-03-30 01:51:25 +00:00
|
|
|
float pointsf = *points;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-04-10 14:06:24 +00:00
|
|
|
/* for some reason scaling fonts goes too fast compared to widget size */
|
Holiday coding log :)
Nice formatted version (pictures soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.66/Usability
Short list of main changes:
- Transparent region option (over main region), added code to blend in/out such panels.
- Min size window now 640 x 480
- Fixed DPI for ui - lots of cleanup and changes everywhere. Icon image need correct size still, layer-in-use icon needs remake.
- Macbook retina support, use command line --no-native-pixels to disable it
- Timeline Marker label was drawing wrong
- Trackpad and magic mouse: supports zoom (hold ctrl)
- Fix for splash position: removed ghost function and made window size update after creation immediate
- Fast undo buffer save now adds UI as well. Could be checked for regular file save even...
Quit.blend and temp file saving use this now.
- Fixed filename in window on reading quit.blend or temp saves, and they now add a warning in window title: "(Recovered)"
- New Userpref option "Keep Session" - this always saves quit.blend, and loads on start.
This allows keeping UI and data without actual saves, until you actually save.
When you load startup.blend and quit, it recognises the quit.blend as a startup (no file name in header)
- Added 3D view copy/paste buffers (selected objects). Shortcuts ctrl-c, ctrl-v (OSX, cmd-c, cmd-v).
Coded partial file saving for it. Could be used for other purposes. Todo: use OS clipboards.
- User preferences (themes, keymaps, user settings) now can be saved as a separate file.
The old option is called "Save Startup File", the new one "Save User Settings".
To visualise this difference, the 'save startup file' button has been removed from the user preferences window. That option is still available as CTRL+U and in the File menu.
- OSX: fixed bug that stopped giving mouse events outside window.
This also fixes "Continuous Grab" for OSX. (error since 2009)
2012-12-12 18:58:11 +00:00
|
|
|
/* XXX not true anymore? (ton) */
|
|
|
|
//aspect = sqrt(aspect);
|
2009-04-10 14:06:24 +00:00
|
|
|
pointsf /= aspect;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (aspect > 1.0f)
|
2012-03-30 01:51:25 +00:00
|
|
|
*points = ceilf(pointsf);
|
2009-04-10 14:06:24 +00:00
|
|
|
else
|
2012-03-30 01:51:25 +00:00
|
|
|
*points = floorf(pointsf);
|
2009-04-10 14:06:24 +00:00
|
|
|
}
|
|
|
|
}
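/* A hedged worked example of the scaling above (illustrative numbers, not
 * taken from the code): an 11pt style drawn at block->aspect 2.0 gives
 * pointsf = 11 / 2.0 = 5.5, rounded up (aspect > 1.0f) to 6pt; at aspect 0.5
 * it gives 22.0, floored to 22pt; any aspect inside the 0.9..1.1 band leaves
 * *points untouched. */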
|
|
|
|
|
2009-04-09 18:11:18 +00:00
|
|
|
/* project button or block (but==NULL) to pixels in regionspace */
|
|
|
|
static void ui_but_to_pixelrect(rcti *rect, const ARegion *ar, uiBlock *block, uiBut *but)
|
|
|
|
{
|
2013-06-13 09:12:53 +00:00
|
|
|
rctf rectf;
|
|
|
|
|
|
|
|
ui_block_to_window_rctf(ar, block, &rectf, (but) ? &but->rect : &block->rect);
|
2017-06-23 11:04:58 +03:00
|
|
|
BLI_rcti_rctf_copy_round(rect, &rectf);
|
2015-08-18 14:16:58 +10:00
|
|
|
BLI_rcti_translate(rect, -ar->winrct.xmin, -ar->winrct.ymin);
|
2009-04-09 18:11:18 +00:00
|
|
|
}
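/* Hedged example of the projection above (hypothetical numbers): if the
 * block/button rect maps to the window-space rectangle (xmin=120, xmax=180,
 * ymin=400, ymax=420) and the region's winrct origin is (100, 380), the
 * rounded, region-space pixel rect becomes (20..80, 20..40). */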
|
|
|
|
|
2009-04-10 14:06:24 +00:00
|
|
|
/* uses local copy of style, to scale things down, and allow widgets to change stuff */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_draw(const bContext *C, uiBlock *block)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2014-11-09 21:20:40 +01:00
|
|
|
uiStyle style = *UI_style_get_dpi(); /* XXX pass on as arg */
|
2009-03-25 20:49:15 +00:00
|
|
|
ARegion *ar;
|
2008-11-11 18:31:32 +00:00
|
|
|
uiBut *but;
|
2009-04-09 18:11:18 +00:00
|
|
|
rcti rect;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-03-25 20:49:15 +00:00
|
|
|
/* get menu region or area region */
|
2012-03-30 01:51:25 +00:00
|
|
|
ar = CTX_wm_menu(C);
|
2012-03-24 06:38:07 +00:00
|
|
|
if (!ar)
|
2012-03-30 01:51:25 +00:00
|
|
|
ar = CTX_wm_region(C);
|
2009-03-25 20:49:15 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (!block->endblock)
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_end(C, block);
|
2009-01-28 23:29:27 +00:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* we set this only once */
|
2018-06-27 19:07:23 -06:00
|
|
|
GPU_blend_set_func_separate(GPU_SRC_ALPHA, GPU_ONE_MINUS_SRC_ALPHA, GPU_ONE, GPU_ONE_MINUS_SRC_ALPHA);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-04-10 14:06:24 +00:00
|
|
|
/* scale fonts */
|
|
|
|
ui_fontscale(&style.paneltitle.points, block->aspect);
|
|
|
|
ui_fontscale(&style.grouplabel.points, block->aspect);
|
|
|
|
ui_fontscale(&style.widgetlabel.points, block->aspect);
|
|
|
|
ui_fontscale(&style.widget.points, block->aspect);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-04-10 14:06:24 +00:00
|
|
|
/* scale block min/max to rect */
|
|
|
|
ui_but_to_pixelrect(&rect, ar, block, NULL);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-04-07 17:08:26 +00:00
|
|
|
/* pixel space for AA widgets */
|
2018-07-15 15:27:15 +02:00
|
|
|
GPU_matrix_push_projection();
|
|
|
|
GPU_matrix_push();
|
|
|
|
GPU_matrix_identity_set();
|
2014-09-10 13:24:31 +10:00
|
|
|
|
2016-02-29 15:20:09 +01:00
|
|
|
wmOrtho2_region_pixelspace(ar);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2009-04-09 18:11:18 +00:00
|
|
|
/* back */
|
Pie Menus C code backend.
This commit merges the code in the pie-menu branch.
As per decisions taken the last few days, there are no pie menus
included and there will be an official add-on including overrides of
some keys with pie menus. However, people will now be able to use the
new code in python.
Full Documentation is in http://wiki.blender.org/index.php/Dev:Ref/
Thanks:
Campbell Barton, Dalai Felinto and Ton Roosendaal for the code review
and design comments
Jonathan Williamson, Pawel Lyczkowski, Pablo Vazquez among others for
suggestions during the development.
Special Thanks to Sean Olson, for his support, suggestions, testing and
merciless bugging so that I would finish the pie menu code. Without him
we wouldn't be here. Also to the rest of the developers of the original
python add-on, Patrick Moore and Dan Eicher and finally to Matt Ebb, who
did the research and first implementation and whose code I used to get
started.
2014-08-11 10:39:59 +02:00
|
|
|
if (block->flag & UI_BLOCK_RADIAL)
|
|
|
|
ui_draw_pie_center(block);
|
2018-04-22 22:08:48 +02:00
|
|
|
else if (block->flag & UI_BLOCK_POPOVER)
|
2018-04-28 22:54:11 +02:00
|
|
|
ui_draw_popover_back(ar, &style, block, &rect);
|
2014-08-11 10:39:59 +02:00
|
|
|
else if (block->flag & UI_BLOCK_LOOP)
|
2009-04-10 14:06:24 +00:00
|
|
|
ui_draw_menu_back(&style, block, &rect);
|
2018-09-11 11:22:08 +10:00
|
|
|
else if (block->panel) {
|
|
|
|
bool show_background = ar->alignment != RGN_ALIGN_FLOAT;
|
|
|
|
ui_draw_aligned_panel(
|
|
|
|
&style, block, &rect,
|
|
|
|
UI_panel_category_is_visible(ar), show_background);
|
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2018-03-31 13:09:03 +02:00
|
|
|
BLF_batch_draw_begin();
|
2018-03-31 19:32:28 +02:00
|
|
|
UI_icon_draw_cache_begin();
|
2018-04-06 14:30:20 +02:00
|
|
|
UI_widgetbase_draw_cache_begin();
|
2018-03-30 21:09:24 +02:00
|
|
|
|
2009-04-09 18:11:18 +00:00
|
|
|
/* widgets */
|
2012-03-30 01:51:25 +00:00
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
|
|
|
if (!(but->flag & (UI_HIDDEN | UI_SCROLLED))) {
|
Code holiday commit:
- fix: user pref, window title was reset to 'Blender' on tab usage
- Undo history menu back:
- name "Undo History"
- hotkey alt+ctrl+z (alt+apple+z for mac)
- works like 2.4x, only for global undo, editmode and particle edit.
- Menu scroll
- for small windows or screens, popup menus can now display
all items, using internal scrolling
- works with a timer, scrolling 10 items per second when mouse
is over the top or bottom arrow
- if menu is too big to display, it now draws to top or bottom,
based on largest available space.
- also works for hotkey driven pop up menus.
- User pref "DPI" follows widget/layout size
- widgets & headers now become bigger and smaller, to match
'dpi' font sizes. Works well to match UI to monitor size.
- note that icons can get fuzzy, we need better mipmaps for it
2011-06-04 17:03:46 +00:00
|
|
|
ui_but_to_pixelrect(&rect, ar, block, but);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2010-09-21 14:39:18 +00:00
|
|
|
/* XXX: figure out why invalid coordinates happen when closing render window */
|
|
|
|
/* and material preview is redrawn in main window (temp fix for bug #23848) */
|
2012-03-24 06:38:07 +00:00
|
|
|
if (rect.xmin < rect.xmax && rect.ymin < rect.ymax)
|
2011-06-04 17:03:46 +00:00
|
|
|
ui_draw_but(C, ar, &style, but, &rect);
|
|
|
|
}
|
2009-04-09 18:11:18 +00:00
|
|
|
}
|
2018-03-30 21:09:24 +02:00
|
|
|
|
2018-04-06 14:30:20 +02:00
|
|
|
UI_widgetbase_draw_cache_end();
|
2018-03-31 19:32:28 +02:00
|
|
|
UI_icon_draw_cache_end();
|
2018-03-31 13:09:03 +02:00
|
|
|
BLF_batch_draw_end();
|
2011-10-31 14:08:14 +00:00
|
|
|
|
2017-04-19 10:26:40 +02:00
|
|
|
/* restore matrix */
|
2018-07-15 15:27:15 +02:00
|
|
|
GPU_matrix_pop_projection();
|
|
|
|
GPU_matrix_pop();
|
2008-11-11 18:31:32 +00:00
|
|
|
}
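/* In rough outline (as read from the function above): finish the block if
 * needed, set the blend function once, scale the style fonts by the block
 * aspect, switch to region pixel space, draw the backdrop (pie center,
 * popover, menu or panel background), then draw each visible button inside
 * the BLF/icon/widget-base draw caches before restoring the matrices. */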
|
|
|
|
|
2017-11-13 19:43:34 +11:00
|
|
|
static void ui_block_message_subscribe(ARegion *ar, struct wmMsgBus *mbus, uiBlock *block)
|
|
|
|
{
|
|
|
|
uiBut *but_prev = NULL;
|
|
|
|
/* possibly we should keep the region this block is contained in? */
|
|
|
|
for (uiBut *but = block->buttons.first; but; but = but->next) {
|
|
|
|
if (but->rnapoin.type && but->rnaprop) {
|
|
|
|
/* quick check to avoid adding buttons representing a vector, multiple times. */
|
|
|
|
if ((but_prev &&
|
2018-11-21 06:21:58 +11:00
|
|
|
(but_prev->rnaprop == but->rnaprop) &&
|
|
|
|
(but_prev->rnapoin.type == but->rnapoin.type) &&
|
|
|
|
(but_prev->rnapoin.data == but->rnapoin.data) &&
|
|
|
|
(but_prev->rnapoin.id.data == but->rnapoin.id.data)) == false)
|
2017-11-13 19:43:34 +11:00
|
|
|
{
|
|
|
|
/* TODO: could make this into utility function. */
|
|
|
|
WM_msg_subscribe_rna(
|
|
|
|
mbus, &but->rnapoin, but->rnaprop,
|
|
|
|
&(const wmMsgSubscribeValue){
|
|
|
|
.owner = ar,
|
|
|
|
.user_data = ar,
|
|
|
|
.notify = ED_region_do_msg_notify_tag_redraw,
|
|
|
|
}, __func__);
|
|
|
|
but_prev = but;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
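/* Note (descriptive, not part of the original source): the but_prev
 * comparison above keeps consecutive buttons that share the same RNA pointer
 * and property (e.g. the three sliders exposing one vector) down to a single
 * WM_msg_subscribe_rna() call per run of buttons. */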
|
|
|
|
|
|
|
|
void UI_region_message_subscribe(ARegion *ar, struct wmMsgBus *mbus)
|
|
|
|
{
|
|
|
|
for (uiBlock *block = ar->uiblocks.first; block; block = block->next) {
|
|
|
|
ui_block_message_subscribe(ar, mbus, block);
|
|
|
|
}
|
|
|
|
}
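/* A minimal, hypothetical usage sketch (the surrounding loop and the 'sa'
 * variable are assumptions for illustration, not part of this file): walk an
 * area's regions and let each one (re)subscribe its UI blocks.
 *
 *   for (ARegion *region = sa->regionbase.first; region; region = region->next) {
 *       UI_region_message_subscribe(region, mbus);
 *   }
 */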
|
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* ************* EVENTS ************* */
|
|
|
|
|
2014-01-20 11:13:53 +11:00
|
|
|
/**
|
|
|
|
* Check if the button is pushed, this is only meaningful for some button types.
|
|
|
|
*
|
2014-01-22 02:48:11 +11:00
|
|
|
* \return (0 == UNSELECT), (1 == SELECT), (-1 == DO-NOTHING)
|
2014-01-20 11:13:53 +11:00
|
|
|
*/
|
2014-11-09 21:20:40 +01:00
|
|
|
int ui_but_is_pushed_ex(uiBut *but, double *value)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2014-04-11 11:25:41 +10:00
|
|
|
int is_push = 0;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
  if (but->bit) {
    const bool state = ELEM(but->type, UI_BTYPE_TOGGLE_N, UI_BTYPE_ICON_TOGGLE_N, UI_BTYPE_CHECKBOX_N) ? false : true;
    int lvalue;
    UI_GET_BUT_VALUE_INIT(but, *value);
    lvalue = (int)*value;
    if (UI_BITBUT_TEST(lvalue, (but->bitnr))) {
      is_push = state;
    }
    else {
      is_push = !state;
    }
  }
  else {
    switch (but->type) {
      case UI_BTYPE_BUT:
      case UI_BTYPE_HOTKEY_EVENT:
      case UI_BTYPE_KEY_EVENT:
      case UI_BTYPE_COLOR:
        is_push = -1;
        break;
      case UI_BTYPE_BUT_TOGGLE:
      case UI_BTYPE_TOGGLE:
      case UI_BTYPE_ICON_TOGGLE:
      case UI_BTYPE_CHECKBOX:
        UI_GET_BUT_VALUE_INIT(but, *value);
        if (*value != (double)but->hardmin) is_push = true;
        break;
      case UI_BTYPE_ICON_TOGGLE_N:
      case UI_BTYPE_TOGGLE_N:
      case UI_BTYPE_CHECKBOX_N:
        UI_GET_BUT_VALUE_INIT(but, *value);
        if (*value == 0.0) is_push = true;
        break;
      case UI_BTYPE_ROW:
      case UI_BTYPE_LISTROW:
UI: New Global Top-Bar (WIP)
== Main Features/Changes for Users
* Add a horizontal bar at the top of all non-temp windows, consisting of two horizontal sub-bars.
* The upper sub-bar contains global menus (File, Render, etc.), tabs for workspaces and a scene selector.
* The lower sub-bar contains the object mode selector, screen-layout and render-layer selector. Later, operator and/or tool settings will be placed here.
* Individual sections of the topbar are individually scrollable.
* Workspace tabs can be double- or ctrl-clicked for renaming and contain an 'x' icon for deleting.
* The top-bar should scale nicely with DPI.
* The lower half of the top-bar can be hidden by dragging the lower top-bar edge up. Better hiding options are planned (e.g. hide in fullscreen modes).
* Info editors at the top of the window that use the full window width will be replaced by the top-bar.
* In fullscreen modes, no info editor is added on top anymore; the top-bar replaces it.
== Technical Features/Changes
* Adds initial support for global areas.
A global area is part of the window, not part of the regular screen-layout.
I've added a macro iterator to iterate over both global and screen-layout level areas. When iterating over areas, developers should from now on always consider whether they have to include global areas.
* Adds a TOPBAR editor type.
The editor type is hidden in the UI editor type menu.
* Adds a variation of the ID template to display IDs as tab buttons (template_ID_tabs in BPY).
* Makes various changes to the RNA button creation code to improve their appearance in the horizontal top-bar.
* Adds support for dynamically sized regions, that is, regions that scale automatically to the layout bounds.
The code for this is currently a big hack (it's based on drawing the UI multiple times). This should definitely be improved.
* Adds a template for displaying operator properties optimized for the top-bar. This will probably still change a lot and is in fact disabled in code.
Since the final top-bar design depends a lot on other 2.8 designs (mainly the tool-system and workspaces), we decided not to show the operator or tool settings in the top-bar for now. That means most of the lower sub-bar is empty for the time being.
NOTE: Top-bar and global area data is not written to files or SDNA. They are simply added to the window when opening Blender or reading a file. This allows us to make changes to the top-bar without having to care about compatibility.
== ToDo's
It's a bit hard to predict all the ToDo's; here are the known main ones:
* Add options for the new active-tool system and for operator redo to the topbar.
* Automatically hide the top-bar in fullscreen modes.
* General visual polish.
* Top-bar drag & drop support (WIP in temp-tab_drag_drop).
* Improve dynamic regions (should also fix some layout glitches).
* Make internal terminology consistent.
* Enable topbar file writing once the design is more advanced.
* Address TODO's and XXX's in code :)
Thanks @brecht for the review! And @sergey for the complaining ;)
Differential Revision: D2758
2018-04-20 17:14:03 +02:00
      case UI_BTYPE_TAB:
        if ((but->type == UI_BTYPE_TAB) && but->rnaprop && but->custom_data) {
          /* uiBut.custom_data points to data this tab represents (e.g. workspace).
           * uiBut.rnapoin/prop store an active value (e.g. active workspace). */
          if (RNA_property_type(but->rnaprop) == PROP_POINTER) {
            PointerRNA active_ptr = RNA_property_pointer_get(&but->rnapoin, but->rnaprop);
            if (active_ptr.data == but->custom_data) {
              is_push = true;
            }
          }
          break;
        }
        else if (but->optype) {
          break;
        }
        UI_GET_BUT_VALUE_INIT(but, *value);
        /* support for rna enum buts */
        if (but->rnaprop && (RNA_property_flag(but->rnaprop) & PROP_ENUM_FLAG)) {
          if ((int)*value & (int)but->hardmax) is_push = true;
        }
        else {
          if (*value == (double)but->hardmax) is_push = true;
        }
        break;
      default:
        is_push = -1;
        break;
    }
  }

  return is_push;
}

int ui_but_is_pushed(uiBut *but)
{
  double value = UI_BUT_VALUE_UNSET;
  return ui_but_is_pushed_ex(but, &value);
}

static void ui_but_update_select_flag(uiBut *but, double *value)
{
  switch (ui_but_is_pushed_ex(but, value)) {
    case true:
      but->flag |= UI_SELECT;
      break;
    case false:
      but->flag &= ~UI_SELECT;
      break;
  }
}
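
/* Usage sketch (illustrative only, not code from this file): a block refresh
 * would typically run ui_but_update_select_flag() for each button so the drawn
 * UI_SELECT state tracks the underlying value. The function name and the plain
 * list walk below are assumptions made for this example. */
static void example_block_refresh_select_flags(uiBlock *block)
{
  uiBut *but;

  for (but = block->buttons.first; but; but = but->next) {
    double value = UI_BUT_VALUE_UNSET;
    /* Buttons reporting -1 (no pushed state) fall through the true/false
     * switch above and keep their current flag. */
    ui_but_update_select_flag(but, &value);
  }
}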

/* ************************************************ */

void UI_block_lock_set(uiBlock *block, bool val, const char *lockstr)
{
  if (val) {
    block->lock = val;
    block->lockstr = lockstr;
2.5: ID datablock button back, previously known as std_libbuttons. The
way this worked in 2.4x wasn't really clean, with events going all over
the place and using dubious variables such as G.but->lockpoin or
G.sima->menunr. It works as follows now, for example:
xco= uiDefIDPoinButs(block, CTX_data_main(C), NULL, (ID**)&sima->image, ID_IM, &sima->pin, xco, yco,
sima_idpoin_handle, UI_ID_BROWSE|UI_ID_RENAME|UI_ID_ADD_NEW|UI_ID_OPEN|UI_ID_DELETE|UI_ID_ALONE|UI_ID_PIN);
The last two parameters are a callback function, and a list of events
or functionalities that are supported. The callback function will then
get the ID pointer + event to handle.
2009-02-06 16:40:14 +00:00
  }
}

void UI_block_lock_clear(uiBlock *block)
{
  block->lock = false;
  block->lockstr = NULL;
}
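
/* Usage sketch (hypothetical, not taken from this file): a drawing function can
 * lock the block before creating buttons for a library-linked datablock so that
 * everything added in between is greyed out, then clear the lock afterwards. */
static void example_draw_id_buttons(uiBlock *block, ID *id)
{
  if (id->lib != NULL) {
    UI_block_lock_set(block, true, "Can't edit external library data");
  }

  /* ... uiDefBut*() calls creating buttons for this ID would go here ... */

  UI_block_lock_clear(block);
}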

/* *********************** data get/set ***********************
 * this either works with the pointed to data, or can work with
 * an edit override pointer while dragging for example */

/* Get a PointerRNA which points to data inside of an evaluated ID datablock. */
static PointerRNA ui_but_evaluated_rnapoin_get(uiBut *but)
{
  BLI_assert(but->rnaprop != NULL);
  /* TODO(sergey): evil_C sounds.. EVIL! Any clear way to avoid this? */
  PointerRNA rnapoin_eval = but->rnapoin;
  /* If there is no animation or drivers, it doesn't matter whether we read the
   * value from the evaluated datablock or from the original one.
   *
   * Reading from the original one is much faster, since we don't need to do any
   * PointerRNA remapping or hash lookup.
   */
  if (BKE_animdata_from_id(but->rnapoin.id.data) == NULL) {
    return rnapoin_eval;
  }
  /* The same goes for properties which cannot be animated. */
  if (!RNA_property_animateable(&but->rnapoin, but->rnaprop)) {
    return rnapoin_eval;
  }
  Depsgraph *depsgraph = CTX_data_depsgraph(but->block->evil_C);
  /* ID pointers we can always remap, they are inside of the depsgraph. */
  rnapoin_eval.id.data =
          DEG_get_evaluated_id(depsgraph, rnapoin_eval.id.data);
  /* Some ID datablocks do not have an evaluated copy inside of the dependency
   * graph. If it's such a datablock, no need to worry about the data pointer.
   */
  if (rnapoin_eval.id.data == but->rnapoin.id.data) {
    return rnapoin_eval;
  }
  /* For the data pointer it's getting a bit more involved, since it can either
   * be an ID, or a property deep inside of an ID.
   *
   * We start by checking if it's an ID, since that is the less involved code
   * path, and probably the one executed in most cases.
   */
  if (but->rnapoin.data == but->rnapoin.id.data) {
    rnapoin_eval.data = DEG_get_evaluated_id(depsgraph, rnapoin_eval.data);
    return rnapoin_eval;
  }
  /* We aren't as lucky as we thought we are :(
   *
   * Since we don't know what the property is, we get its RNA path relative to
   * the original ID, and then we descend from the evaluated ID to the same
   * property.
   *
   * This seems to be the most straightforward way to get sub-data pointers
   * which can be buried deep inside of an ID block.
   */
  const char *rna_path =
          RNA_path_from_ID_to_property(&but->rnapoin, but->rnaprop);
  if (rna_path != NULL) {
    PointerRNA id_ptr;
    RNA_id_pointer_create(rnapoin_eval.id.data, &id_ptr);
    if (!RNA_path_resolve_full(&id_ptr,
                               rna_path,
                               &rnapoin_eval,
                               NULL, NULL))
    {
      /* TODO(sergey): Anything to do here to recover? */
    }
    MEM_freeN((void *)rna_path);
  }
  return rnapoin_eval;
}
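
/* Usage sketch (hypothetical wrapper, not code from this file): resolve the
 * button's RNA pointer into the evaluated copy first, then read the property
 * value from there. */
static float example_but_float_get_evaluated(uiBut *but)
{
  PointerRNA ptr_eval = ui_but_evaluated_rnapoin_get(but);
  return RNA_property_float_get(&ptr_eval, but->rnaprop);
}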
/* for buttons pointing to color for example */
void ui_but_v3_get(uiBut *but, float vec[3])
{
  PropertyRNA *prop;
  int a;

  if (but->editvec) {
    copy_v3_v3(vec, but->editvec);
  }

  if (but->rnaprop) {
    prop = but->rnaprop;
    zero_v3(vec);

    PointerRNA rnapoin_eval = ui_but_evaluated_rnapoin_get(but);

    if (RNA_property_type(prop) == PROP_FLOAT) {
      int tot = RNA_property_array_length(&rnapoin_eval, prop);
      BLI_assert(tot > 0);

      if (tot == 3) {
        RNA_property_float_get_array(&rnapoin_eval, prop, vec);
      }
      else {
        tot = min_ii(tot, 3);
        for (a = 0; a < tot; a++) {
          vec[a] = RNA_property_float_get_index(&rnapoin_eval, prop, a);
        }
      }
    }
  }
  else if (but->pointype == UI_BUT_POIN_CHAR) {
    const char *cp = (char *)but->poin;

    vec[0] = ((float)cp[0]) / 255.0f;
    vec[1] = ((float)cp[1]) / 255.0f;
    vec[2] = ((float)cp[2]) / 255.0f;
  }
  else if (but->pointype == UI_BUT_POIN_FLOAT) {
    const float *fp = (float *)but->poin;
    copy_v3_v3(vec, fp);
  }
  else {
    if (but->editvec == NULL) {
      fprintf(stderr, "%s: can't get color, should never happen\n", __func__);
      zero_v3(vec);
    }
  }

  if (but->type == UI_BTYPE_UNITVEC) {
    normalize_v3(vec);
  }
}

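/* A minimal usage sketch (illustrative only, not part of the original code): assuming a
 * color button, a caller could read the current value with ui_but_v3_get(), adjust it
 * with a BLI_math helper, and write it back with ui_but_v3_set() below. The function
 * name ui_but_v3_scale_example is hypothetical. */
#if 0
static void ui_but_v3_scale_example(uiBut *but, const float factor)
{
  float col[3];

  ui_but_v3_get(but, col);  /* fetch the button's current color/vector */
  mul_v3_fl(col, factor);   /* scale it (BLI_math_vector helper) */
  ui_but_v3_set(but, col);  /* write it back through editvec/RNA/poin */
}
#endif
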
/* for buttons pointing to color for example */
void ui_but_v3_set(uiBut *but, const float vec[3])
{
  PropertyRNA *prop;

  if (but->editvec) {
    copy_v3_v3(but->editvec, vec);
  }

  if (but->rnaprop) {
    prop = but->rnaprop;
    if (RNA_property_type(prop) == PROP_FLOAT) {
      int tot;
      int a;

      tot = RNA_property_array_length(&but->rnapoin, prop);
      BLI_assert(tot > 0);

      if (tot == 3) {
        RNA_property_float_set_array(&but->rnapoin, prop, vec);
      }
      else {
        tot = min_ii(tot, 3);
        for (a = 0; a < tot; a++) {
          RNA_property_float_set_index(&but->rnapoin, prop, a, vec[a]);
        }
      }
    }
  }
  else if (but->pointype == UI_BUT_POIN_CHAR) {
    char *cp = (char *)but->poin;

    cp[0] = (char)(0.5f + vec[0] * 255.0f);
    cp[1] = (char)(0.5f + vec[1] * 255.0f);
    cp[2] = (char)(0.5f + vec[2] * 255.0f);
  }
  else if (but->pointype == UI_BUT_POIN_FLOAT) {
    float *fp = (float *)but->poin;
    copy_v3_v3(fp, vec);
  }
}

bool ui_but_is_float(const uiBut *but)
{
  if (but->pointype == UI_BUT_POIN_FLOAT && but->poin)
    return true;

  if (but->rnaprop && RNA_property_type(but->rnaprop) == PROP_FLOAT)
    return true;

  return false;
}

bool ui_but_is_bool(const uiBut *but)
{
  if (ELEM(but->type, UI_BTYPE_TOGGLE, UI_BTYPE_TOGGLE_N, UI_BTYPE_ICON_TOGGLE, UI_BTYPE_ICON_TOGGLE_N, UI_BTYPE_TAB))
    return true;

  if (but->rnaprop && RNA_property_type(but->rnaprop) == PROP_BOOLEAN)
    return true;

  if ((but->rnaprop && RNA_property_type(but->rnaprop) == PROP_ENUM) && (but->type == UI_BTYPE_ROW))
    return true;

  return false;
}
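
/* Hypothetical usage sketch, not part of the original file: one way a caller
 * could combine the predicates above when turning a button's numeric value
 * into display text. The helper name and the use of BLI_snprintf() here are
 * illustrative assumptions only. */
static void example_format_but_value(const uiBut *but, double value, char *str, size_t maxlen)
{
  if (ui_but_is_bool(but)) {
    BLI_snprintf(str, maxlen, "%s", (value != 0.0) ? "ON" : "OFF");
  }
  else if (ui_but_is_float(but)) {
    BLI_snprintf(str, maxlen, "%.3f", value);
  }
  else {
    BLI_snprintf(str, maxlen, "%d", (int)value);
  }
}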

bool ui_but_is_unit(const uiBut *but)
{
  UnitSettings *unit = but->block->unit;
  const int unit_type = UI_but_unit_type_get(but);

  if (unit_type == PROP_UNIT_NONE)
    return false;

#if 1 /* removed so angle buttons get correct snapping */
  if (ui_but_is_unit_radians_ex(unit, unit_type))
    return false;
#endif

  /* for now disable time unit conversion */
  if (unit_type == PROP_UNIT_TIME)
    return false;

  if (unit->system == USER_UNIT_NONE) {
    if (unit_type != PROP_UNIT_ROTATION) {
      return false;
    }
  }

  return true;
}

/**
 * Check if this button is similar enough to be grouped with another.
 */
bool ui_but_is_compatible(const uiBut *but_a, const uiBut *but_b)
{
  if (but_a->type != but_b->type)
    return false;
  if (but_a->pointype != but_b->pointype)
    return false;

  if (but_a->rnaprop) {
    /* skip 'rnapoin.data', 'rnapoin.id.data'
     * allow different data to have the same props edited at once */
    if (but_a->rnapoin.type != but_b->rnapoin.type)
      return false;
    if (RNA_property_type(but_a->rnaprop) != RNA_property_type(but_b->rnaprop))
      return false;
    if (RNA_property_subtype(but_a->rnaprop) != RNA_property_subtype(but_b->rnaprop))
      return false;
  }

  return true;
}
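
/* Hypothetical sketch, not part of the original file: scanning a block's
 * button list for the first button that could be grouped with a reference
 * button, using ui_but_is_compatible() as the predicate. The helper name and
 * the assumption that uiBlock::buttons is a ListBase of uiBut are
 * illustrative only. */
static uiBut *example_find_compatible_but(uiBlock *block, const uiBut *but_ref)
{
  uiBut *but;
  for (but = block->buttons.first; but; but = but->next) {
    if ((but != but_ref) && ui_but_is_compatible(but_ref, but)) {
      return but;
    }
  }
  return NULL;
}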

bool ui_but_is_rna_valid(uiBut *but)
{
  if (but->rnaprop == NULL || RNA_struct_contains_property(&but->rnapoin, but->rnaprop)) {
    return true;
  }
  else {
    printf("property removed %s: %p\n", but->drawstr, but->rnaprop);
    return false;
  }
}

/**
 * Checks if the button supports ctrl+mousewheel cycling
 */
bool ui_but_supports_cycling(const uiBut *but)
{
  return ((ELEM(but->type, UI_BTYPE_ROW, UI_BTYPE_NUM, UI_BTYPE_NUM_SLIDER, UI_BTYPE_LISTBOX)) ||
          (but->type == UI_BTYPE_MENU && ui_but_menu_step_poll(but)) ||
          (but->type == UI_BTYPE_COLOR && but->a1 != -1) ||
          (but->menu_step_func != NULL));
}
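
/* Hypothetical illustration, not part of the original file: an event handler
 * could use ui_but_supports_cycling() to decide whether a ctrl+mousewheel
 * event should be consumed for value cycling or passed through. The helper
 * name is an assumption for this sketch. */
static bool example_wants_ctrl_wheel(const uiBut *but, const bool ctrl_pressed)
{
  return ctrl_pressed && ui_but_supports_cycling(but);
}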

double ui_but_value_get(uiBut *but)
{
  PropertyRNA *prop;
  double value = 0.0;

  if (but->editval) { return *(but->editval); }
  if (but->poin == NULL && but->rnapoin.data == NULL) return 0.0;

  if (but->rnaprop) {
    prop = but->rnaprop;
    BLI_assert(but->rnaindex != -1);

    PointerRNA rnapoin_eval = ui_but_evaluated_rnapoin_get(but);

    switch (RNA_property_type(prop)) {
      case PROP_BOOLEAN:
        if (RNA_property_array_check(prop))
          value = RNA_property_boolean_get_index(&rnapoin_eval, prop, but->rnaindex);
        else
          value = RNA_property_boolean_get(&rnapoin_eval, prop);
        break;
      case PROP_INT:
        if (RNA_property_array_check(prop))
          value = RNA_property_int_get_index(&rnapoin_eval, prop, but->rnaindex);
        else
          value = RNA_property_int_get(&rnapoin_eval, prop);
        break;

      case PROP_FLOAT:
        if (RNA_property_array_check(prop))
          value = RNA_property_float_get_index(&rnapoin_eval, prop, but->rnaindex);
        else
          value = RNA_property_float_get(&rnapoin_eval, prop);
        break;

      case PROP_ENUM:
        value = RNA_property_enum_get(&rnapoin_eval, prop);
        break;

      default:
        value = 0.0;
        break;
    }
  }
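  /* Buttons not backed by an RNA property fall through to the raw pointer:
   * the pointype tag selects which C type but->poin is read as. */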
  else if (but->pointype == UI_BUT_POIN_CHAR) {
    value = *(char *)but->poin;
  }
  else if (but->pointype == UI_BUT_POIN_SHORT) {
    value = *(short *)but->poin;
  }
  else if (but->pointype == UI_BUT_POIN_INT) {
    value = *(int *)but->poin;
  }
  else if (but->pointype == UI_BUT_POIN_FLOAT) {
    value = *(float *)but->poin;
  }

  return value;
}
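/* Note (added, illustrative example; not part of the original source): the
 * getter above and ui_but_value_set() below give the interface code a single
 * read/write path for a button's value, whether it is backed by an RNA
 * property or by a raw pointer.  A minimal sketch of how a caller might use
 * the pair, assuming the getter is ui_but_value_get(uiBut *) as in current
 * Blender sources and that 'but' is a valid numeric button; the wrapper name
 * and the delta handling are hypothetical. */
#if 0 /* illustrative only */
static void example_nudge_but_value(uiBut *but, double delta)
{
  double value = ui_but_value_get(but); /* RNA property or raw pointer, handled above */
  ui_but_value_set(but, value + delta); /* written back through the setter below */
}
#endif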

void ui_but_value_set(uiBut *but, double value)
{
  PropertyRNA *prop;

  /* value is a hsv value: convert to rgb */
  if (but->rnaprop) {
    prop = but->rnaprop;

    if (RNA_property_editable(&but->rnapoin, prop)) {
      switch (RNA_property_type(prop)) {
        case PROP_BOOLEAN:
          if (RNA_property_array_check(prop))
            RNA_property_boolean_set_index(&but->rnapoin, prop, but->rnaindex, value);
          else
            RNA_property_boolean_set(&but->rnapoin, prop, value);
          break;

        case PROP_INT:
          if (RNA_property_array_check(prop))
            RNA_property_int_set_index(&but->rnapoin, prop, but->rnaindex, (int)value);
          else
            RNA_property_int_set(&but->rnapoin, prop, (int)value);
          break;

        case PROP_FLOAT:
          if (RNA_property_array_check(prop))
            RNA_property_float_set_index(&but->rnapoin, prop, but->rnaindex, value);
          else
            RNA_property_float_set(&but->rnapoin, prop, value);
          break;

        case PROP_ENUM:
          if (RNA_property_flag(prop) & PROP_ENUM_FLAG) {
            int ivalue = (int)value;
            /* toggle for enum/flag buttons */
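            /* For example: current flags 0b0101 XOR value 0b0100 gives 0b0001,
             * i.e. only the bit(s) named by 'value' are flipped. */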
            ivalue ^= RNA_property_enum_get(&but->rnapoin, prop);
            RNA_property_enum_set(&but->rnapoin, prop, ivalue);
          }
          else {
            RNA_property_enum_set(&but->rnapoin, prop, value);
          }
          break;

        default:
          break;
      }
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2011-06-16 06:00:02 +00:00
|
|
|
|
|
|
|
/* we can't be sure what RNA set functions actually do,
|
|
|
|
* so leave this unset */
|
2012-03-30 01:51:25 +00:00
|
|
|
value = UI_BUT_VALUE_UNSET;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2012-10-14 13:08:19 +00:00
|
|
|
else if (but->pointype == 0) {
|
|
|
|
/* pass */
|
|
|
|
}
|
2008-11-11 18:31:32 +00:00
|
|
|
else {
|
|
|
|
/* first do rounding */
|
2012-09-12 00:32:33 +00:00
|
|
|
if (but->pointype == UI_BUT_POIN_CHAR) {
|
2017-09-18 21:11:41 +10:00
|
|
|
value = round_db_to_uchar_clamp(value);
|
2012-09-12 00:32:33 +00:00
|
|
|
}
|
|
|
|
else if (but->pointype == UI_BUT_POIN_SHORT) {
|
2017-09-18 21:11:41 +10:00
|
|
|
value = round_db_to_short_clamp(value);
|
|
|
|
}
|
|
|
|
else if (but->pointype == UI_BUT_POIN_INT) {
|
|
|
|
value = round_db_to_int_clamp(value);
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2012-09-12 00:32:33 +00:00
|
|
|
else if (but->pointype == UI_BUT_POIN_FLOAT) {
|
2012-03-30 01:51:25 +00:00
|
|
|
float fval = (float)value;
|
|
|
|
if (fval >= -0.00001f && fval <= 0.00001f) fval = 0.0f; /* prevent negative zero */
|
|
|
|
value = fval;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* then set value with possible edit override */
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->editval)
|
2012-03-30 01:51:25 +00:00
|
|
|
value = *but->editval = value;
|
2012-09-12 00:32:33 +00:00
|
|
|
else if (but->pointype == UI_BUT_POIN_CHAR)
|
2012-03-30 01:51:25 +00:00
|
|
|
value = *((char *)but->poin) = (char)value;
|
2012-09-12 00:32:33 +00:00
|
|
|
else if (but->pointype == UI_BUT_POIN_SHORT)
|
2012-03-30 01:51:25 +00:00
|
|
|
value = *((short *)but->poin) = (short)value;
|
2012-09-12 00:32:33 +00:00
|
|
|
else if (but->pointype == UI_BUT_POIN_INT)
|
2012-03-30 01:51:25 +00:00
|
|
|
value = *((int *)but->poin) = (int)value;
|
2012-09-12 00:32:33 +00:00
|
|
|
else if (but->pointype == UI_BUT_POIN_FLOAT)
|
2012-03-30 01:51:25 +00:00
|
|
|
value = *((float *)but->poin) = (float)value;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_update_select_flag(but, &value);
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
int ui_but_string_get_max_length(uiBut *but)
|
2009-03-30 17:31:37 +00:00
|
|
|
{
|
2015-03-24 15:05:27 +11:00
|
|
|
if (ELEM(but->type, UI_BTYPE_TEXT, UI_BTYPE_SEARCH_MENU))
|
2009-03-30 17:31:37 +00:00
|
|
|
return but->hardmax;
|
|
|
|
else
|
|
|
|
return UI_MAX_DRAW_STR;
|
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
uiBut *ui_but_drag_multi_edit_get(uiBut *but)
|
2014-02-25 10:28:41 +11:00
|
|
|
{
|
|
|
|
uiBut *but_iter;
|
|
|
|
|
|
|
|
BLI_assert(but->flag & UI_BUT_DRAG_MULTI);
|
|
|
|
|
|
|
|
for (but_iter = but->block->buttons.first; but_iter; but_iter = but_iter->next) {
|
|
|
|
if (but_iter->editstr) {
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
return but_iter;
|
|
|
|
}
|
|
|
|
|
2015-04-21 14:24:13 +10:00
|
|
|
/** \name Check to show extra icons
|
|
|
|
*
|
|
|
|
* Extra icons are shown on the right hand side of buttons.
|
2016-11-28 18:59:31 +01:00
|
|
|
* This could (should!) definitely become more generic, but for now this is good enough.
|
2015-04-21 14:24:13 +10:00
|
|
|
* \{ */
|
|
|
|
|
2016-11-28 18:59:31 +01:00
|
|
|
static bool ui_but_icon_extra_is_visible_text_clear(const uiBut *but)
|
|
|
|
{
|
|
|
|
BLI_assert(but->type == UI_BTYPE_TEXT);
|
2017-01-20 18:43:42 +01:00
|
|
|
return ((but->flag & UI_BUT_VALUE_CLEAR) && but->drawstr[0]);
|
2016-11-28 18:59:31 +01:00
|
|
|
}
|
|
|
|
|
2015-04-21 14:24:13 +10:00
|
|
|
static bool ui_but_icon_extra_is_visible_search_unlink(const uiBut *but)
|
|
|
|
{
|
2018-08-23 19:58:54 +02:00
|
|
|
BLI_assert(ELEM(but->type, UI_BTYPE_SEARCH_MENU));
|
2015-04-21 14:24:13 +10:00
|
|
|
return ((but->editstr == NULL) &&
|
|
|
|
(but->drawstr[0] != '\0') &&
|
2016-11-28 18:59:31 +01:00
|
|
|
(but->flag & UI_BUT_VALUE_CLEAR));
|
2015-04-21 14:24:13 +10:00
|
|
|
}
|
|
|
|
|
2016-11-28 18:59:31 +01:00
|
|
|
static bool ui_but_icon_extra_is_visible_search_eyedropper(uiBut *but)
|
2015-04-21 14:24:13 +10:00
|
|
|
{
|
2015-04-21 15:18:16 +10:00
|
|
|
StructRNA *type;
|
|
|
|
short idcode;
|
2015-04-21 14:24:13 +10:00
|
|
|
|
2016-11-28 18:59:31 +01:00
|
|
|
BLI_assert(but->type == UI_BTYPE_SEARCH_MENU && (but->flag & UI_BUT_VALUE_CLEAR));
|
2015-04-21 14:24:13 +10:00
|
|
|
|
2015-04-21 15:18:16 +10:00
|
|
|
if (but->rnaprop == NULL) {
|
|
|
|
return false;
|
|
|
|
}
|
|
|
|
|
|
|
|
type = RNA_property_pointer_type(&but->rnapoin, but->rnaprop);
|
|
|
|
idcode = RNA_type_to_ID_code(type);
|
|
|
|
|
2015-04-21 14:24:13 +10:00
|
|
|
return ((but->editstr == NULL) &&
|
|
|
|
(idcode == ID_OB || OB_DATA_SUPPORT_ID(idcode)));
|
|
|
|
}
|
|
|
|
|
|
|
|
uiButExtraIconType ui_but_icon_extra_get(uiBut *but)
|
|
|
|
{
|
2016-11-28 18:59:31 +01:00
|
|
|
switch (but->type) {
|
|
|
|
case UI_BTYPE_TEXT:
|
|
|
|
if (ui_but_icon_extra_is_visible_text_clear(but)) {
|
|
|
|
return UI_BUT_ICONEXTRA_CLEAR;
|
|
|
|
}
|
|
|
|
break;
|
|
|
|
case UI_BTYPE_SEARCH_MENU:
|
|
|
|
if ((but->flag & UI_BUT_VALUE_CLEAR) == 0) {
|
|
|
|
/* pass */
|
|
|
|
}
|
|
|
|
else if (ui_but_icon_extra_is_visible_search_unlink(but)) {
|
|
|
|
return UI_BUT_ICONEXTRA_CLEAR;
|
|
|
|
}
|
|
|
|
else if (ui_but_icon_extra_is_visible_search_eyedropper(but)) {
|
|
|
|
return UI_BUT_ICONEXTRA_EYEDROPPER;
|
|
|
|
}
|
|
|
|
break;
|
|
|
|
default:
|
|
|
|
break;
|
2015-04-21 14:24:13 +10:00
|
|
|
}
|
|
|
|
|
|
|
|
return UI_BUT_ICONEXTRA_NONE;
|
|
|
|
}
|
|
|
|
|
|
|
|
/** \} */
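A caller-side sketch of how the extra-icon query above is typically consumed (hedged: the drawing code lives elsewhere and 'but' is assumed to be in scope; only the enum values returned above are used):

/* Hedged sketch: pick which extra icon, if any, to draw for a button. */
switch (ui_but_icon_extra_get(but)) {
	case UI_BUT_ICONEXTRA_CLEAR:      /* draw the clear ('x') icon */ break;
	case UI_BUT_ICONEXTRA_EYEDROPPER: /* draw the eyedropper icon */ break;
	case UI_BUT_ICONEXTRA_NONE:
	default:                          /* no extra icon */ break;
}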
|
|
|
|
|
|
|
|
|
Added time units, currently only used when metric or imperial units are enabled.
Long/short units:
day,d, hour,hr,h, minute,min,m, second,sec,s, millisecond,ms, microsecond,us
This may also fix some bugs that were reported.
Note: to convert fps to time, evil_C needs to be used to get the scene.
2009-08-12 05:20:16 +00:00
|
|
|
static double ui_get_but_scale_unit(uiBut *but, double value)
|
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
UnitSettings *unit = but->block->unit;
|
2014-11-09 21:20:40 +01:00
|
|
|
int unit_type = UI_but_unit_type_get(but);
|
2009-08-12 05:20:16 +00:00
|
|
|
|
2014-08-26 20:52:07 +10:00
|
|
|
/* Time unit is a bit special, not handled by BKE_scene_unit_scale() for now. */
|
2014-08-26 12:04:24 +02:00
|
|
|
if (unit_type == PROP_UNIT_TIME) { /* WARNING - using evil_C :| */
|
2012-03-30 01:51:25 +00:00
|
|
|
Scene *scene = CTX_data_scene(but->block->evil_C);
|
2009-08-12 05:20:16 +00:00
|
|
|
return FRA2TIME(value);
|
|
|
|
}
|
|
|
|
else {
|
2014-08-26 20:52:07 +10:00
|
|
|
return BKE_scene_unit_scale(unit, RNA_SUBTYPE_UNIT_VALUE(unit_type), value);
|
added time units, currently only used when metric or imperial are enabled.
long/short units...
day,d, hour,hr,h, minute,min,m, second,sec,s, millisecond,ms, microsecond,us
Also may fix some bugs that were reported.
Note, to convert fps to time evil_C needs to be used to get the scene.
2009-08-12 05:20:16 +00:00
|
|
|
}
|
|
|
|
}
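For the PROP_UNIT_TIME branch above, FRA2TIME(value) converts a frame count to seconds using the scene's frame rate, which is why evil_C is needed to reach the scene. A minimal sketch of the idea (the frs_sec/frs_sec_base field names are assumptions for illustration, not a definitive expansion of the macro):

/* Hedged sketch: frames -> seconds using the scene frame rate. */
static double frames_to_seconds(const Scene *scene, double frames)
{
	return frames * ((double)scene->r.frs_sec_base / (double)scene->r.frs_sec);
}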
|
|
|
|
|
patch [#23758] Better handling of UTF chars in UNITS fields (lengths, angles, etc.)
from Lorenzo Tozzi (oni_niubbo) with minor edits.
--- from the tracker
The present situation is this: due to bug #22274, during editing, UTF-8 characters are stripped from buttons with an associated unit
(length, angles, etc.).
Example: if the button displays '90°' and you click on it with LMB, the editing string becomes '90'.
The problem arises if you use microns: '34µm' becomes '34', which Blender interprets as 34 meters. So clicking on a button
and hitting enter won't confirm the previous value, but will change it (very badly, too).
Of course nobody is using microns in Blender, but the problem will arise when we implement areas and the 'Separate
Units' option is enabled. The value '2m² 3cm²' would become '2m' during editing.
This patch solves the problem by rewriting the string in a smarter way than just stripping the UTF-8 characters: the unit is translated
from unit->name_short ('µm') to unit->name_alt ('um'). So clicking on '34µm' makes the editing string
'34um'.
--- end
note: rather than allowing empty strings in the name_alt field, I made it so that if the unit system is the default one, a NULL name_alt will just strip the string, since being the default it is not needed.
2010-09-15 17:37:00 +00:00
|
|
|
/* str will be overwritten */
|
2014-11-09 21:20:40 +01:00
|
|
|
void ui_but_convert_to_unit_alt_name(uiBut *but, char *str, size_t maxlen)
|
2010-09-15 17:37:00 +00:00
|
|
|
{
|
2014-11-09 21:20:40 +01:00
|
|
|
if (ui_but_is_unit(but)) {
|
2012-03-30 01:51:25 +00:00
|
|
|
UnitSettings *unit = but->block->unit;
|
2014-11-09 21:20:40 +01:00
|
|
|
int unit_type = UI_but_unit_type_get(but);
|
2010-09-15 17:37:00 +00:00
|
|
|
char *orig_str;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2014-06-14 18:32:18 +10:00
|
|
|
orig_str = BLI_strdup(str);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-04-10 09:03:45 +00:00
|
|
|
bUnit_ToUnitAltName(str, maxlen, orig_str, unit->system, RNA_SUBTYPE_UNIT_VALUE(unit_type));
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2010-09-15 17:37:00 +00:00
|
|
|
MEM_freeN(orig_str);
|
|
|
|
}
|
|
|
|
}
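For illustration of the rewrite described in the patch note above, a minimal usage sketch (assuming 'but' is a metric length button; the buffer contents are illustrative):

/* Hedged sketch: before editing starts, '34µm' in the edit buffer is
 * rewritten to its ASCII alt name '34um', so confirming the field does
 * not get reinterpreted as 34 meters. */
char buf[UI_MAX_DRAW_STR];
BLI_strncpy(buf, "34µm", sizeof(buf));
ui_but_convert_to_unit_alt_name(but, buf, sizeof(buf)); /* buf becomes "34um" */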
|
|
|
|
|
2013-03-04 04:21:51 +00:00
|
|
|
/**
|
2018-12-12 12:50:58 +11:00
|
|
|
* \param float_precision: Override the button precision.
|
2013-03-04 04:21:51 +00:00
|
|
|
*/
|
2014-01-04 17:16:19 +11:00
|
|
|
static void ui_get_but_string_unit(uiBut *but, char *str, int len_max, double value, bool pad, int float_precision)
|
2009-08-12 05:20:16 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
UnitSettings *unit = but->block->unit;
|
2014-11-09 21:20:40 +01:00
|
|
|
int unit_type = UI_but_unit_type_get(but);
|
2013-03-04 04:21:51 +00:00
|
|
|
int precision;
|
2009-08-12 05:20:16 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
if (unit->scale_length < 0.0001f) unit->scale_length = 1.0f; // XXX do_versions
|
2009-08-13 07:37:41 +00:00
|
|
|
|
2013-03-04 04:21:51 +00:00
|
|
|
/* Use precision override? */
|
|
|
|
if (float_precision == -1) {
|
|
|
|
/* Sanity checks */
|
|
|
|
precision = (int)but->a2;
|
2014-01-08 17:04:10 +01:00
|
|
|
if (precision > UI_PRECISION_FLOAT_MAX) precision = UI_PRECISION_FLOAT_MAX;
|
2014-04-20 22:05:05 +10:00
|
|
|
else if (precision == -1) precision = 2;
|
2013-03-04 04:21:51 +00:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
precision = float_precision;
|
|
|
|
}
|
2009-08-12 05:20:16 +00:00
|
|
|
|
2018-10-03 10:20:16 +02:00
|
|
|
bUnit_AsString2(
|
2018-07-01 19:57:31 +02:00
|
|
|
str, len_max, ui_get_but_scale_unit(but, value), precision,
|
2018-10-03 10:20:16 +02:00
|
|
|
RNA_SUBTYPE_UNIT_VALUE(unit_type), unit, pad);
|
2009-08-12 05:20:16 +00:00
|
|
|
}
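A minimal usage sketch for the formatter above (assuming 'but' is a float length button; the value and buffer size are illustrative). Passing float_precision == -1 uses the button's own precision (but->a2, clamped, defaulting to 2):

/* Hedged sketch: format 0.05 Blender units with the button's unit settings,
 * giving roughly "5 cm" for a metric scene. */
char numstr[UI_MAX_DRAW_STR];
ui_get_but_string_unit(but, numstr, sizeof(numstr), 0.05, false, -1);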
|
|
|
|
|
2010-07-24 00:24:58 +00:00
|
|
|
static float ui_get_but_step_unit(uiBut *but, float step_default)
|
2009-08-12 08:16:10 +00:00
|
|
|
{
|
2014-11-09 21:20:40 +01:00
|
|
|
int unit_type = RNA_SUBTYPE_UNIT_VALUE(UI_but_unit_type_get(but));
|
2015-09-17 21:50:40 +10:00
|
|
|
const double step_orig = step_default * UI_PRECISION_FLOAT_SCALE;
|
2019-01-15 23:24:20 +11:00
|
|
|
/* Scaling up 'step_orig' here is a bit arbitrary,
|
|
|
|
* it's just giving better scales from the user's POV */
|
2015-09-17 21:50:40 +10:00
|
|
|
const double scale_step = ui_get_but_scale_unit(but, step_orig * 10);
|
|
|
|
const double step = bUnit_ClosestScalar(scale_step, but->block->unit->system, unit_type);
|
2009-08-12 08:16:10 +00:00
|
|
|
|
2013-09-02 22:28:18 +00:00
|
|
|
/* -1 is an error value */
|
|
|
|
if (step != -1.0) {
|
2015-09-17 21:50:40 +10:00
|
|
|
const double scale_unit = ui_get_but_scale_unit(but, 1.0);
|
|
|
|
const double step_unit = bUnit_ClosestScalar(scale_unit, but->block->unit->system, unit_type);
|
|
|
|
double step_final;
|
|
|
|
|
2013-09-02 22:28:18 +00:00
|
|
|
BLI_assert(step > 0.0);
|
2015-09-17 21:50:40 +10:00
|
|
|
|
2015-10-17 16:06:45 +11:00
|
|
|
step_final = (step / scale_unit) / (double)UI_PRECISION_FLOAT_SCALE;
|
2015-09-17 21:50:40 +10:00
|
|
|
|
|
|
|
if (step == step_unit) {
|
|
|
|
/* Logic here is to scale by the original 'step_orig'
|
|
|
|
* only when the unit step matches the scaled step.
|
|
|
|
*
|
|
|
|
* This is needed for units that don't have a wide range of scales (degrees, for example).
|
|
|
|
* Without this we can't select between a single degree and a 10th of a degree.
|
|
|
|
*/
|
|
|
|
step_final *= step_orig;
|
|
|
|
}
|
|
|
|
|
|
|
|
return (float)step_final;
|
2009-08-12 08:16:10 +00:00
|
|
|
}
|
|
|
|
else {
|
|
|
|
return step_default;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2013-03-04 04:21:51 +00:00
|
|
|
/**
|
2018-12-12 12:50:58 +11:00
|
|
|
* \param float_precision: For number buttons the precision to use or -1 to fallback to the button default.
|
|
|
|
* \param use_exp_float: Use exponent representation of floats when out of reasonable range (outside of 1e3/1e-3).
|
2013-03-04 04:21:51 +00:00
|
|
|
*/
|
2017-07-17 18:22:12 +02:00
|
|
|
void ui_but_string_get_ex(uiBut *but, char *str, const size_t maxlen, const int float_precision, const bool use_exp_float, bool *r_use_exp_float)
|
{
  if (r_use_exp_float) {
    *r_use_exp_float = false;
  }

  if (but->rnaprop && ELEM(but->type, UI_BTYPE_TEXT, UI_BTYPE_SEARCH_MENU, UI_BTYPE_TAB)) {
    PropertyType type;
    const char *buf = NULL;
    int buf_len;

    type = RNA_property_type(but->rnaprop);

    if ((but->type == UI_BTYPE_TAB) && (but->custom_data)) {
      StructRNA *ptr_type = RNA_property_pointer_type(&but->rnapoin, but->rnaprop);
      PointerRNA ptr;

      /* uiBut.custom_data points to data this tab represents (e.g. workspace).
       * uiBut.rnapoin/prop store an active value (e.g. active workspace). */
      RNA_pointer_create(but->rnapoin.id.data, ptr_type, but->custom_data, &ptr);
      buf = RNA_struct_name_get_alloc(&ptr, str, maxlen, &buf_len);
    }
    else if (type == PROP_STRING) {
      /* RNA string */
      buf = RNA_property_string_get_alloc(&but->rnapoin, but->rnaprop, str, maxlen, &buf_len);
    }
    else if (type == PROP_ENUM) {
      /* RNA enum */
      int value = RNA_property_enum_get(&but->rnapoin, but->rnaprop);
      if (RNA_property_enum_name(but->block->evil_C, &but->rnapoin, but->rnaprop, value, &buf)) {
        BLI_strncpy(str, buf, maxlen);
        buf = str;
      }
    }
    else if (type == PROP_POINTER) {
      /* RNA pointer */
      PointerRNA ptr = RNA_property_pointer_get(&but->rnapoin, but->rnaprop);
      buf = RNA_struct_name_get_alloc(&ptr, str, maxlen, &buf_len);
    }
    else {
      BLI_assert(0);
    }

    if (!buf) {
      str[0] = '\0';
    }
    else if (buf && buf != str) {
      BLI_assert(maxlen <= buf_len + 1);
      /* string was too long, we have to truncate */
      if (ui_but_is_utf8(but)) {
        BLI_strncpy_utf8(str, buf, maxlen);
      }
      else {
        BLI_strncpy(str, buf, maxlen);
      }
      MEM_freeN((void *)buf);
    }
  }
  else if (ELEM(but->type, UI_BTYPE_TEXT, UI_BTYPE_SEARCH_MENU)) {
    /* string */
    BLI_strncpy(str, but->poin, maxlen);
    return;
  }
  else if (ui_but_anim_expression_get(but, str, maxlen)) {
    /* driver expression */
  }
  else {
    /* number editing */
    double value;

    value = ui_but_value_get(but);

    PropertySubType subtype = PROP_NONE;
    if (but->rnaprop) {
      subtype = RNA_property_subtype(but->rnaprop);
    }

    if (ui_but_is_float(but)) {
      int prec = (float_precision == -1) ? ui_but_calc_float_precision(but, value) : float_precision;

      if (ui_but_is_unit(but)) {
        ui_get_but_string_unit(but, str, maxlen, value, false, prec);
      }
      else if (subtype == PROP_FACTOR) {
        if (U.factor_display_type == USER_FACTOR_AS_FACTOR) {
          BLI_snprintf(str, maxlen, "%.*f", prec, value);
        }
        else {
          BLI_snprintf(str, maxlen, "%.*f", MAX2(0, prec - 2), value * 100);
        }
      }
      else {
        if (use_exp_float) {
          const int int_digits_num = integer_digits_f(value);
          if (int_digits_num < -6 || int_digits_num > 12) {
            BLI_snprintf(str, maxlen, "%.*g", prec, value);
            if (r_use_exp_float) {
              *r_use_exp_float = true;
            }
          }
          else {
            prec -= int_digits_num;
            CLAMP(prec, 0, UI_PRECISION_FLOAT_MAX);
            BLI_snprintf(str, maxlen, "%.*f", prec, value);
          }
        }
        else {
#if 0 /* TODO, but will likely break some stuff, so better after 2.79 release. */
          prec -= int_digits_num;
          CLAMP(prec, 0, UI_PRECISION_FLOAT_MAX);
#endif
          BLI_snprintf(str, maxlen, "%.*f", prec, value);
        }
      }
    }
    else {
      BLI_snprintf(str, maxlen, "%d", (int)value);
    }
  }
}

void ui_but_string_get(uiBut *but, char *str, const size_t maxlen)
{
  ui_but_string_get_ex(but, str, maxlen, -1, false, NULL);
}

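As a quick illustration of how the two getters above relate: a caller that wants the button's default formatting uses the thin wrapper, while a caller that needs explicit precision goes through the _ex variant. This is a minimal sketch under the assumption of a valid uiBut pointer; the helper name and the local buffer size are hypothetical, not part of the file.

/* Illustrative sketch only (not part of interface.c): fetch a button's
 * current contents as text, first with default formatting, then with a
 * fixed precision of 3 and exponent notation disabled. */
void example_get_button_text(uiBut *but)
{
  char buf[256];

  /* Default formatting: precision -1 falls back to the button's own. */
  ui_but_string_get(but, buf, sizeof(buf));

  /* Explicit precision; NULL means we don't ask whether exponent
   * notation would have been used. */
  ui_but_string_get_ex(but, buf, sizeof(buf), 3, false, NULL);
}
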
/**
 * A version of #ui_but_string_get_ex for dynamic buffer sizes
 * (where #ui_but_string_get_max_length returns 0).
 *
 * \param r_str_size: size of the returned string (including terminator).
 */
char *ui_but_string_get_dynamic(uiBut *but, int *r_str_size)
{
  char *str = NULL;
  *r_str_size = 1;

  if (but->rnaprop && ELEM(but->type, UI_BTYPE_TEXT, UI_BTYPE_SEARCH_MENU)) {
    PropertyType type;

    type = RNA_property_type(but->rnaprop);

    if (type == PROP_STRING) {
      /* RNA string */
      str = RNA_property_string_get_alloc(&but->rnapoin, but->rnaprop, NULL, 0, r_str_size);
      (*r_str_size) += 1;
    }
    else if (type == PROP_ENUM) {
      /* RNA enum */
      int value = RNA_property_enum_get(&but->rnapoin, but->rnaprop);
      const char *value_id;
      if (!RNA_property_enum_name(but->block->evil_C, &but->rnapoin, but->rnaprop, value, &value_id)) {
        value_id = "";
      }

      *r_str_size = strlen(value_id) + 1;
      str = BLI_strdupn(value_id, *r_str_size);
    }
    else if (type == PROP_POINTER) {
      /* RNA pointer */
      PointerRNA ptr = RNA_property_pointer_get(&but->rnapoin, but->rnaprop);
      str = RNA_struct_name_get_alloc(&ptr, NULL, 0, r_str_size);
      (*r_str_size) += 1;
    }
    else {
      BLI_assert(0);
    }
  }
  else {
    BLI_assert(0);
  }

  if (UNLIKELY(str == NULL)) {
    /* should never happen, paranoid check */
    *r_str_size = 1;
    str = BLI_strdup("");
    BLI_assert(0);
  }

  return str;
}

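A sketch of how the dynamic variant might be used when the string length is not known up front. The buffer is allocated by the RNA/BLI calls above and is expected to be released with MEM_freeN; the helper name below is hypothetical and stdio is assumed to be available.

/* Illustrative sketch only: get a button string of arbitrary length.
 * The returned buffer must be freed by the caller. */
void example_print_button_text(uiBut *but)
{
  int str_size = 0;
  char *str = ui_but_string_get_dynamic(but, &str_size);

  printf("button text (%d bytes incl. terminator): %s\n", str_size, str);

  MEM_freeN(str);
}
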
static bool ui_set_but_string_eval_num_unit(bContext *C, uiBut *but, const char *str, double *r_value)
{
  const UnitSettings *unit = but->block->unit;
  int type = RNA_SUBTYPE_UNIT_VALUE(UI_but_unit_type_get(but));
  return user_string_to_number(C, str, unit, type, r_value);
}

static bool ui_number_from_string(bContext *C, const char *str, double *r_value)
{
#ifdef WITH_PYTHON
  return BPY_execute_string_as_number(C, NULL, str, true, r_value);
#else
  *r_value = atof(str);
  return true;
#endif
}

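The function above is what gives number fields expression evaluation when Blender is built with Python: the typed string is handed to BPY_execute_string_as_number(), otherwise it degrades to a plain atof(). The sketch below is illustrative only; the sample input and its expected result are assumptions about the Python path, not tests from this file.

/* Illustrative sketch only: parsing user input through ui_number_from_string().
 * With WITH_PYTHON, input such as "3*4+1" is evaluated as an expression;
 * without it, only a plain leading number (e.g. "13") is parsed by atof(). */
static void example_parse_number(bContext *C)
{
  double value = 0.0;
  if (ui_number_from_string(C, "3*4+1", &value)) {
    /* value is expected to be 13.0 in a WITH_PYTHON build. */
  }
}
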
static bool ui_number_from_string_factor(bContext *C, const char *str, double *r_value)
{
  int len = strlen(str);
  if (BLI_strn_endswith(str, "%", len)) {
    char *str_new = BLI_strdupn(str, len - 1);
    bool success = ui_number_from_string(C, str_new, r_value);
    MEM_freeN(str_new);
    *r_value /= 100.0;
    return success;
  }
  else {
    if (!ui_number_from_string(C, str, r_value)) {
      return false;
    }
    if (U.factor_display_type == USER_FACTOR_AS_PERCENTAGE) {
      *r_value /= 100.0;
    }
    return true;
  }
}

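To make the two branches above concrete: a trailing '%' always divides by 100, while a bare number is only rescaled when the user preference displays factors as percentages. A small sketch, illustrative only; the literal results assume the underlying numeric parsing succeeds.

/* Illustrative sketch only: how factor input is interpreted.
 *   "50%" -> 0.5 regardless of the preference.
 *   "0.5" -> 0.5 when U.factor_display_type == USER_FACTOR_AS_FACTOR.
 *   "50"  -> 0.5 when U.factor_display_type == USER_FACTOR_AS_PERCENTAGE. */
static void example_parse_factor(bContext *C)
{
  double value = 0.0;
  ui_number_from_string_factor(C, "50%", &value); /* value becomes 0.5 */
}
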
static bool ui_number_from_string_percentage(bContext *C, const char *str, double *r_value)
{
  int len = strlen(str);
  if (BLI_strn_endswith(str, "%", len)) {
    char *str_new = BLI_strdupn(str, len - 1);
    bool success = ui_number_from_string(C, str_new, r_value);
    MEM_freeN(str_new);
    return success;
  }
  else {
    return ui_number_from_string(C, str, r_value);
  }
}

bool ui_but_string_set_eval_num(bContext *C, uiBut *but, const char *str, double *r_value)
{
  if (str[0] == '\0') {
    *r_value = 0.0;
    return true;
  }

  PropertySubType subtype = PROP_NONE;
  if (but->rnaprop) {
    subtype = RNA_property_subtype(but->rnaprop);
  }

  if (ui_but_is_float(but)) {
    if (ui_but_is_unit(but)) {
      return ui_set_but_string_eval_num_unit(C, but, str, r_value);
    }
    else if (subtype == PROP_FACTOR) {
      return ui_number_from_string_factor(C, str, r_value);
    }
    else if (subtype == PROP_PERCENTAGE) {
      return ui_number_from_string_percentage(C, str, r_value);
    }
    else {
      return ui_number_from_string(C, str, r_value);
    }
  }
  else {
    return ui_number_from_string(C, str, r_value);
  }
}

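A sketch of the typical flow for number buttons: the text typed by the user is converted back to a double through the dispatcher above, which picks unit, factor, percentage or plain expression parsing based on the button. Illustrative only; the helper name is hypothetical and error handling is reduced to the boolean return.

/* Illustrative sketch only: turn typed text into a value for a number button. */
static bool example_apply_typed_number(bContext *C, uiBut *but, const char *typed)
{
  double value;
  if (!ui_but_string_set_eval_num(C, but, typed, &value)) {
    return false; /* string could not be evaluated */
  }
  /* A real caller would clamp against the button's soft/hard range and
   * write the result back to the button; omitted here. */
  return true;
}
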
/* just the assignment/free part */
static void ui_but_string_set_internal(uiBut *but, const char *str, size_t str_len)
{
  BLI_assert(str_len == strlen(str));
  BLI_assert(but->str == NULL);
  str_len += 1;

  if (str_len > UI_MAX_NAME_STR) {
    but->str = MEM_mallocN(str_len, "ui_def_but str");
  }
  else {
    but->str = but->strdata;
  }
  memcpy(but->str, str, str_len);
}

static void ui_but_string_free_internal(uiBut *but)
{
  if (but->str) {
    if (but->str != but->strdata) {
      MEM_freeN(but->str);
    }
    /* must call 'ui_but_string_set_internal' after */
    but->str = NULL;
  }
}

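The two helpers above always work as a pair: the old string must be freed (or be NULL) before a new one is assigned, and short strings reuse the button's fixed strdata storage instead of allocating. A hedged sketch of replacing a button's stored string with them; the wrapper name is hypothetical.

/* Illustrative sketch only: swap the string stored on a button. */
static void example_replace_but_str(uiBut *but, const char *new_str)
{
  ui_but_string_free_internal(but);                           /* sets but->str to NULL */
  ui_but_string_set_internal(but, new_str, strlen(new_str));  /* copies or allocates */
}
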
bool ui_but_string_set(bContext *C, uiBut *but, const char *str)
{
  if (but->rnaprop && but->rnapoin.data && ELEM(but->type, UI_BTYPE_TEXT, UI_BTYPE_SEARCH_MENU)) {
    if (RNA_property_editable(&but->rnapoin, but->rnaprop)) {
      PropertyType type;

      type = RNA_property_type(but->rnaprop);

      if (type == PROP_STRING) {
        /* RNA string */
        RNA_property_string_set(&but->rnapoin, but->rnaprop, str);
        return true;
      }
      else if (type == PROP_POINTER) {
        if (str[0] == '\0') {
          RNA_property_pointer_set(&but->rnapoin, but->rnaprop, PointerRNA_NULL);
          return true;
        }
        else {
2018-04-20 17:14:03 +02:00
|
|
|
					/* RNA pointer */
					PointerRNA rptr;
					PointerRNA ptr = but->rnasearchpoin;
					PropertyRNA *prop = but->rnasearchprop;

					/* This is kind of hackish. In theory we could only ever use the second
					 * member of this if/else, since ui_searchbox_apply() is supposed to always
					 * set that pointer when we are storing pointers. But the string search is
					 * kept first for now, to break as little existing code as possible.
					 * All this is band-aids anyway: using editstr as the main 'reference' for
					 * the whole search button is weak and should be redesigned,
					 * but that's not a simple task. */
					if (prop && RNA_property_collection_lookup_string(&ptr, prop, str, &rptr)) {
						RNA_property_pointer_set(&but->rnapoin, but->rnaprop, rptr);
					}
					else if (but->func_arg2 != NULL) {
						RNA_pointer_create(NULL, RNA_property_pointer_type(&but->rnapoin, but->rnaprop), but->func_arg2, &rptr);
						RNA_property_pointer_set(&but->rnapoin, but->rnaprop, rptr);
					}

					return true;
				}

				return false;
			}
			else if (type == PROP_ENUM) {
				int value;
				if (RNA_property_enum_value(but->block->evil_C, &but->rnapoin, but->rnaprop, str, &value)) {
					RNA_property_enum_set(&but->rnapoin, but->rnaprop, value);
					return true;
				}
				return false;
			}
			else {
				BLI_assert(0);
			}
		}
	}
	else if (but->type == UI_BTYPE_TAB) {
		if (but->rnaprop && but->custom_data) {
			StructRNA *ptr_type = RNA_property_pointer_type(&but->rnapoin, but->rnaprop);
			PointerRNA ptr;
			PropertyRNA *prop;

			/* uiBut.custom_data points to the data this tab represents (e.g. workspace).
			 * uiBut.rnapoin/prop store the active value (e.g. active workspace). */
			RNA_pointer_create(but->rnapoin.id.data, ptr_type, but->custom_data, &ptr);
			prop = RNA_struct_name_property(ptr_type);
			if (RNA_property_editable(&ptr, prop)) {
				RNA_property_string_set(&ptr, prop, str);
			}
		}
	}
	else if (but->type == UI_BTYPE_TEXT) {
		/* string */
		if (!but->poin) {
			str = "";
		}
		else if (ui_but_is_utf8(but)) {
			BLI_strncpy_utf8(but->poin, str, but->hardmax);
		}
		else {
			BLI_strncpy(but->poin, str, but->hardmax);
		}

		return true;
	}
	else if (but->type == UI_BTYPE_SEARCH_MENU) {
		/* string */
		BLI_strncpy(but->poin, str, but->hardmax);
		return true;
	}
	else if (ui_but_anim_expression_set(but, str)) {
		/* driver expression */
		return true;
	}
	else if (str[0] == '#') {
		/* Shortcut to create a new driver expression (versus immediate Python execution),
		 * e.g. typing "#sin(frame)" into a value button adds a scripted-expression driver
		 * that uses the current frame. */
		return ui_but_anim_expression_create(but, str + 1);
	}
	else {
		/* number editing */
		double value;

		if (ui_but_string_set_eval_num(C, but, str, &value) == false) {
			WM_report_banner_show();
			return false;
		}

		if (!ui_but_is_float(but)) {
			value = floor(value + 0.5);
		}

		/* note that we use hard limits here */
		if (value < (double)but->hardmin) value = but->hardmin;
		if (value > (double)but->hardmax) value = but->hardmax;

		ui_but_value_set(but, value);
		return true;
	}

	return false;
}
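
/* Usage sketch (hypothetical buttons and values, not from this file):
 *   ui_but_string_set(C, num_but, "45.5");         - parsed as a number, clamped to the hard range
 *   ui_but_string_set(C, num_but, "#sin(frame)");  - leading '#' creates a driver expression instead
 *   ui_but_string_set(C, name_but, "Suzanne");     - RNA string/pointer buttons write the property
 */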

void ui_but_default_set(bContext *C, const bool all, const bool use_afterfunc)
{
	wmOperatorType *ot = WM_operatortype_find("UI_OT_reset_default_button", true);

	if (use_afterfunc) {
		PointerRNA *ptr;
		ptr = ui_handle_afterfunc_add_operator(ot, WM_OP_EXEC_DEFAULT, true);
		RNA_boolean_set(ptr, "all", all);
	}
	else {
		PointerRNA ptr;
		WM_operator_properties_create_ptr(&ptr, ot);
		RNA_boolean_set(&ptr, "all", all);
		WM_operator_name_call_ptr(C, ot, WM_OP_EXEC_DEFAULT, &ptr);
		WM_operator_properties_free(&ptr);
	}
}
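
/* Usage sketch (hypothetical call sites, not from this file): */
#if 0
static void example_reset_buttons(bContext *C)
{
	/* queue UI_OT_reset_default_button through the after-func handler,
	 * e.g. while button handling is still in progress */
	ui_but_default_set(C, false, true);

	/* run the operator immediately, with its "all" option enabled */
	ui_but_default_set(C, true, false);
}
#endif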

static double soft_range_round_up(double value, double max)
{
	/* round up to .., 0.1, 0.2, 0.5, 1, 2, 5, 10, 20, 50, ..
	 * checking for 0.0 prevents floating point exceptions */
	double newmax = (value != 0.0) ? pow(10.0, ceil(log(value) / M_LN10)) : 0.0;

	if (newmax * 0.2 >= max && newmax * 0.2 >= value)
		return newmax * 0.2;
	else if (newmax * 0.5 >= max && newmax * 0.5 >= value)
		return newmax * 0.5;
	else
		return newmax;
}

static double soft_range_round_down(double value, double max)
{
	/* round down to .., 0.1, 0.2, 0.5, 1, 2, 5, 10, 20, 50, ..
	 * checking for 0.0 prevents floating point exceptions */
	double newmax = (value != 0.0) ? pow(10.0, floor(log(value) / M_LN10)) : 0.0;

	if (newmax * 5.0 <= max && newmax * 5.0 <= value)
		return newmax * 5.0;
	else if (newmax * 2.0 <= max && newmax * 2.0 <= value)
		return newmax * 2.0;
	else
		return newmax;
}
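
/* Worked examples (not from the original file) for the two rounding helpers above,
 * which snap an extended soft range onto the .., 0.1, 0.2, 0.5, 1, 2, 5, 10, .. series:
 *   soft_range_round_up(3.2, 1.0)    -> 5.0  (smallest candidate covering both arguments)
 *   soft_range_round_up(0.7, 0.5)    -> 1.0
 *   soft_range_round_down(3.2, 10.0) -> 2.0  (largest candidate not exceeding either argument)
 *   soft_range_round_down(0.7, 10.0) -> 0.5
 */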

/* note: this could be split up into functions which handle arrays and not */
static void ui_set_but_soft_range(uiBut *but)
{
	/* Ideally we would not limit this, but practically it's more than enough;
	 * worst case, very long vectors won't use a smart soft-range,
	 * which isn't so bad. */

	if (but->rnaprop) {
		const PropertyType type = RNA_property_type(but->rnaprop);
		double softmin, softmax /*, step, precision*/;
		double value_min;
		double value_max;

		/* clamp button range to something reasonable in case
		 * we get -inf/inf from RNA properties */
		if (type == PROP_INT) {
			const bool is_array = RNA_property_array_check(but->rnaprop);
			int imin, imax, istep;

			RNA_property_int_ui_range(&but->rnapoin, but->rnaprop, &imin, &imax, &istep);
			softmin = (imin == INT_MIN) ? -1e4 : imin;
			softmax = (imax == INT_MAX) ? 1e4 : imax;
			/*step = istep;*/ /*UNUSED*/
			/*precision = 1;*/ /*UNUSED*/

			if (is_array) {
				int value_range[2];
				RNA_property_int_get_array_range(&but->rnapoin, but->rnaprop, value_range);
				value_min = (double)value_range[0];
				value_max = (double)value_range[1];
			}
			else {
				value_min = value_max = (double)RNA_property_int_get(&but->rnapoin, but->rnaprop);
			}
		}
		else if (type == PROP_FLOAT) {
			const bool is_array = RNA_property_array_check(but->rnaprop);
			float fmin, fmax, fstep, fprecision;

			RNA_property_float_ui_range(&but->rnapoin, but->rnaprop, &fmin, &fmax, &fstep, &fprecision);
			softmin = (fmin == -FLT_MAX) ? (float)-1e4 : fmin;
			softmax = (fmax == FLT_MAX) ? (float)1e4 : fmax;
			/*step = fstep;*/ /*UNUSED*/
			/*precision = fprecision;*/ /*UNUSED*/

			if (is_array) {
				float value_range[2];
				RNA_property_float_get_array_range(&but->rnapoin, but->rnaprop, value_range);
				value_min = (double)value_range[0];
				value_max = (double)value_range[1];
			}
			else {
				value_min = value_max = (double)RNA_property_float_get(&but->rnapoin, but->rnaprop);
			}
		}
		else {
			return;
		}

		/* if the value goes out of the soft min/max range, adapt the range */
		if (value_min + 1e-10 < softmin) {
			if (value_min < 0.0)
				softmin = -soft_range_round_up(-value_min, -softmin);
			else
				softmin = soft_range_round_down(value_min, softmin);

			if (softmin < (double)but->hardmin)
				softmin = (double)but->hardmin;
		}
		if (value_max - 1e-10 > softmax) {
			if (value_max < 0.0)
				softmax = -soft_range_round_down(-value_max, -softmax);
			else
				softmax = soft_range_round_up(value_max, softmax);

			if (softmax > (double)but->hardmax)
				softmax = (double)but->hardmax;
		}

		but->softmin = softmin;
		but->softmax = softmax;
	}
	else if (but->poin && (but->pointype & UI_BUT_POIN_TYPES)) {
		float value = ui_but_value_get(but);
		if (isfinite(value)) {
			CLAMP(value, but->hardmin, but->hardmax);
			but->softmin = min_ff(but->softmin, value);
			but->softmax = max_ff(but->softmax, value);
		}
	}
	else {
		BLI_assert(0);
	}
}
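
/* Behaviour sketch (illustrative numbers, not from the original file): dragging and
 * click-stepping stay inside the soft range, but a typed value may exceed it; the soft
 * range is then extended and snapped onto the 1/2/5 series by the helpers above.
 * E.g. a float property with soft range 0..10: typing 37.5 extends softmax to
 * soft_range_round_up(37.5, 10.0) = 50.0, limited to the button's hard maximum. */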

/* ******************* Free ********************/

/* can be called with C==NULL */
static void ui_but_free(const bContext *C, uiBut *but)
{
	if (but->opptr) {
		WM_operator_properties_free(but->opptr);
		MEM_freeN(but->opptr);
	}

	if (but->func_argN) {
		MEM_freeN(but->func_argN);
	}

	if (but->tip_argN) {
		MEM_freeN(but->tip_argN);
	}

	if (but->hold_argN) {
		MEM_freeN(but->hold_argN);
	}

	if (!but->editstr && but->free_search_arg) {
		MEM_SAFE_FREE(but->search_arg);
	}

	if (but->active) {
		/* XXX solve later, buttons should be free-able without context ideally,
		 * however they may have open tooltips or popup windows, which need to
		 * be closed using a context pointer */
		if (C) {
			ui_but_active_free(C, but);
		}
		else {
			if (but->active) {
				MEM_freeN(but->active);
			}
		}
	}
	if (but->str && but->str != but->strdata) {
		MEM_freeN(but->str);
	}

	if ((but->type == UI_BTYPE_IMAGE) && but->poin) {
		IMB_freeImBuf((struct ImBuf *)but->poin);
	}

	if (but->dragpoin && (but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
		MEM_freeN(but->dragpoin);
	}

	BLI_assert(UI_butstore_is_registered(but->block, but) == false);

	MEM_freeN(but);
}
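
/* Ownership summary (editorial note, not a comment from the original file):
 * ui_but_free() only releases but->dragpoin when UI_BUT_DRAGPOIN_FREE is set, and only
 * releases but->str when it was heap-allocated rather than stored in but->strdata.
 * Buttons are normally freed in bulk by UI_block_free() below. */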

/* can be called with C==NULL */
void UI_block_free(const bContext *C, uiBlock *block)
{
	uiBut *but;

	UI_butstore_clear(block);

	while ((but = BLI_pophead(&block->buttons))) {
		ui_but_free(C, but);
  }

  if (block->unit) {
    MEM_freeN(block->unit);
  }

  if (block->func_argN) {
    MEM_freeN(block->func_argN);
  }

  CTX_store_free_list(&block->contexts);

  BLI_freelistN(&block->saferct);
  BLI_freelistN(&block->color_pickers.list);

  MEM_freeN(block);
}

void UI_blocklist_update_window_matrix(const bContext *C, const ListBase *lb)
{
  ARegion *region = CTX_wm_region(C);
  wmWindow *window = CTX_wm_window(C);

  for (uiBlock *block = lb->first; block; block = block->next) {
    if (block->active) {
      ui_update_window_matrix(window, region, block);
    }
  }
}

void UI_blocklist_draw(const bContext *C, const ListBase *lb)
{
  for (uiBlock *block = lb->first; block; block = block->next) {
    if (block->active) {
      UI_block_draw(C, block);
    }
  }
}

/* can be called with C==NULL */
void UI_blocklist_free(const bContext *C, ListBase *lb)
{
  uiBlock *block;

  while ((block = BLI_pophead(lb))) {
    UI_block_free(C, block);
  }
}
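
/* Free the blocks in the list that have no handle and are no longer flagged
 * as active; blocks that are still active only get their flag cleared here,
 * and blocks that have a handle are left untouched. */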
void UI_blocklist_free_inactive(const bContext *C, ListBase *lb)
{
  uiBlock *block, *nextblock;

  for (block = lb->first; block; block = nextblock) {
    nextblock = block->next;

    if (!block->handle) {
      if (!block->active) {
        BLI_remlink(lb, block);
        UI_block_free(C, block);
      }
      else {
        block->active = 0;
      }
    }
  }
}
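
/* Link the block into region->uiblocks, at the head of the list. If a block
 * with the same name is already in the list, it is deactivated, its panel and
 * handle pointers are cleared, and it is remembered in block->oldblock. */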
void UI_block_region_set(uiBlock *block, ARegion *region)
{
  ListBase *lb = &region->uiblocks;
  uiBlock *oldblock = NULL;

  /* Each listbase only has one block with this name; free the block
   * if it is already there so it can be rebuilt from scratch. */
  if (lb) {
    oldblock = BLI_findstring(lb, block->name, offsetof(uiBlock, name));

    if (oldblock) {
      oldblock->active = 0;
      oldblock->panel = NULL;
      oldblock->handle = NULL;
    }

    /* At the beginning of the list! (for dynamic menus/blocks) */
    BLI_addhead(lb, block);
  }

  block->oldblock = oldblock;
}

UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, which makes it
easier to write templates; otherwise you had to store all the info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates but still get nested
correctly inside other layouts.
* The API was changed to allow better nesting. Previously, items were added
to the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so creating things like rows or
columns now always returns a layout that you add the items into. All py
scripts were updated to follow this.
* Computing the layout now goes in two passes: first estimating the
required width/height of all nested layouts, and then in the second
pass using those results to decide on the actual locations (see the
sketch after this list).
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending on whether they are in a column
or a row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned; set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
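
To make the two-pass estimate/arrange idea above concrete, here is a minimal,
self-contained sketch in plain C (ToyItem/ToyRow and the size constants are
invented for the example; this is not Blender's uiLayout API): the estimate
pass sums up the natural width the items in a row ask for, and the arrange
pass then distributes the width that is actually available in proportion to
those estimates.

/* Illustrative sketch only -- invented names, not Blender's layout API. */
#include <stdio.h>
#include <string.h>

#define TOY_MAX_ITEMS 8

typedef struct ToyItem {
  const char *name;
  int est_w, est_h; /* filled in by the estimate pass */
  int x, y, w, h;   /* filled in by the arrange pass */
} ToyItem;

typedef struct ToyRow {
  ToyItem items[TOY_MAX_ITEMS];
  int totitem;
  int est_w, est_h;
} ToyRow;

/* Pass 1: bottom-up, compute the space every item would like and sum it up. */
static void toy_row_estimate(ToyRow *row)
{
  row->est_w = 0;
  row->est_h = 0;
  for (int i = 0; i < row->totitem; i++) {
    ToyItem *item = &row->items[i];
    item->est_w = 20 + 8 * (int)strlen(item->name); /* crude natural width */
    item->est_h = 20;
    row->est_w += item->est_w;
    if (item->est_h > row->est_h) {
      row->est_h = item->est_h;
    }
  }
}

/* Pass 2: top-down, hand out the real width in proportion to the estimates. */
static void toy_row_arrange(ToyRow *row, int x, int y, int avail_w)
{
  int cursor = x;
  for (int i = 0; i < row->totitem; i++) {
    ToyItem *item = &row->items[i];
    item->w = (row->est_w > 0) ? (item->est_w * avail_w) / row->est_w : 0;
    item->h = row->est_h;
    item->x = cursor;
    item->y = y;
    cursor += item->w;
  }
}

int main(void)
{
  ToyRow row = {{{"X"}, {"Y"}, {"Scale"}}, 3, 0, 0};
  toy_row_estimate(&row);
  toy_row_arrange(&row, 0, 0, 300);
  for (int i = 0; i < row.totitem; i++) {
    printf("%-6s x=%-4d w=%d\n", row.items[i].name, row.items[i].x, row.items[i].w);
  }
  return 0;
}

Doing the sizing in a separate first pass is what allows nested rows and
columns to be measured before any button gets its final position.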

Changes to Color Management
After testing and feedback, I've decided to slightly modify the way color
management works internally. While the previous method worked well for
rendering, was a smaller transition and had some advantages over this
new method, it was a bit more ambiguous and was making things difficult
for other areas such as compositing.
This implementation now considers all color data (with only a couple of
exceptions such as brush colors) to be stored in linear RGB color space,
rather than sRGB as previously. This brings it in line with Nuke, which also
operates this way, quite successfully. Color swatches, pickers and color ramp
displays are now gamma corrected to display gamma so you can see what
you're doing, but the numbers themselves are considered linear. This makes
understanding blending modes clearer (a 0.5 value on overlay will not change
the result now) as well as making color swatches act more predictably in the
compositor; however, bringing over color values from applications like
Photoshop or GIMP, which operate in a gamma space, won't give identical
results (see the conversion sketch below).
This commit will convert existing files saved by earlier 2.5 versions to
work generally the same, though there may be some slight differences with
things like textures. Now that we're set on changing other areas of shading,
this won't be too disruptive overall.
I've made a diagram explaining the pipeline here:
http://mke3.net/blender/devel/2.5/25_linear_workflow_pipeline.png
and some docs here:
http://www.blender.org/development/release-logs/blender-250/color-management/
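
To illustrate the display correction described above, here is the textbook
sRGB transfer-function pair as a small standalone sketch (standard formulas,
not a quote of Blender's own conversion helpers): values stay linear
internally and are only converted when shown, and a gamma-space value pasted
in from another application has to go through the inverse transform before it
means the same color.

/* Illustrative sketch only: textbook sRGB <-> linear conversions. */
#include <math.h>
#include <stdio.h>

/* Linear -> display (sRGB): applied when drawing swatches, pickers, ramps. */
static float linear_to_srgb(float c)
{
  return (c <= 0.0031308f) ? 12.92f * c : 1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
}

/* Display (sRGB) -> linear: needed when importing gamma-space values. */
static float srgb_to_linear(float c)
{
  return (c <= 0.04045f) ? c / 12.92f : powf((c + 0.055f) / 1.055f, 2.4f);
}

int main(void)
{
  /* A 0.5 typed into a gamma-space application is a much darker linear value,
   * which is why pasted numbers no longer give identical results. */
  printf("srgb 0.50 -> linear %.4f\n", srgb_to_linear(0.5f));
  printf("linear 0.50 -> srgb %.4f\n", linear_to_srgb(0.5f));
  return 0;
}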

uiBlock *UI_block_begin(const bContext *C, ARegion *region, const char *name, short dt)
{
  uiBlock *block;
  wmWindow *window;
  Scene *scn;
|
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flushed in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
window = CTX_wm_window(C);
|
Changes to Color Management
After testing and feedback, I've decided to slightly modify the way color
management works internally. While the previous method worked well for
rendering, was a smaller transition and had some advantages over this
new method, it was a bit more ambiguous, and was making things difficult
for other areas such as compositing.
This implementation now considers all color data (with only a couple of
exceptions such as brush colors) to be stored in linear RGB color space,
rather than sRGB as previously. This brings it in line with Nuke, which also
operates this way, quite successfully. Color swatches, pickers, color ramp
display are now gamma corrected to display gamma so you can see what
you're doing, but the numbers themselves are considered linear. This
makes understanding blending modes more clear (a 0.5 value on overlay
will not change the result now) as well as making color swatches act more
predictably in the compositor. However, bringing over color values from
applications like Photoshop or GIMP, which operate in a gamma space,
will not give identical results.
This commit will convert over existing files saved by earlier 2.5 versions to
work generally the same, though there may be some slight differences with
things like textures. Now that we're set on changing other areas of shading,
this won't be too disruptive overall.
I've made a diagram explaining the pipeline here:
http://mke3.net/blender/devel/2.5/25_linear_workflow_pipeline.png
and some docs here:
http://www.blender.org/development/release-logs/blender-250/color-management/
2009-12-02 07:56:34 +00:00
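To make the linear workflow described above concrete, here is a minimal sketch of the conversion applied when a stored linear swatch value is shown on screen. These are the standard sRGB transfer functions (IEC 61966-2-1), not Blender's own conversion code.

#include <math.h>

/* Standard sRGB transfer functions, illustrating the workflow above:
 * stored values are linear and are only converted to display (sRGB) space
 * when drawing swatches, pickers and color ramps. */
static float linear_to_srgb(float c)
{
  return (c <= 0.0031308f) ? c * 12.92f : 1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
}

static float srgb_to_linear(float c)
{
  return (c <= 0.04045f) ? c / 12.92f : powf((c + 0.055f) / 1.055f, 2.4f);
}

A stored linear value of 0.5 displays as roughly 0.735 after this correction, which is why a value typed in from a gamma-space application does not look the same.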
|
|
|
scn = CTX_data_scene(C);
|
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correctly inside other layouts.
* API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now when creating things
like rows or columns it always returns a layout which you have to add
the items to. All py scripts were updated to follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending if they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
2009-05-15 11:19:59 +00:00
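The two-pass layout computation mentioned above can be sketched with a small, self-contained example: a first pass estimates the required size of each nested layout bottom-up, and a second pass uses those sizes to assign final positions. The type and function names here (LayoutItem, layout_estimate, layout_place) are illustrative only and are not the actual uiLayout code.

/* Hypothetical two-pass column layout. */
typedef struct LayoutItem {
  struct LayoutItem *items[8]; /* nested layouts or leaf buttons */
  int totitem;
  int w, h; /* estimated size, filled in by the first pass for containers */
  int x, y; /* final position, filled in by the second pass */
} LayoutItem;

/* Pass 1: a column is as wide as its widest child and as tall as all
 * children stacked; leaf items (buttons) keep the size they were created with. */
static void layout_estimate(LayoutItem *col)
{
  if (col->totitem == 0) {
    return;
  }
  col->w = 0;
  col->h = 0;
  for (int i = 0; i < col->totitem; i++) {
    layout_estimate(col->items[i]);
    col->h += col->items[i]->h;
    if (col->items[i]->w > col->w) {
      col->w = col->items[i]->w;
    }
  }
}

/* Pass 2: with all sizes known, assign absolute positions top-down.
 * In this sketch y grows downwards, purely for simplicity. */
static void layout_place(LayoutItem *col, int x, int y)
{
  col->x = x;
  col->y = y;
  for (int i = 0; i < col->totitem; i++) {
    layout_place(col->items[i], x, y);
    y += col->items[i]->h;
  }
}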
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
block = MEM_callocN(sizeof(uiBlock), "uiBlock");
|
|
|
|
block->active = 1;
|
|
|
|
block->dt = dt;
|
2012-08-19 10:41:27 +00:00
|
|
|
block->evil_C = (void *)C; /* XXX */
|
2011-09-22 14:55:39 +00:00
|
|
|
|
2011-08-18 20:01:30 +00:00
|
|
|
if (scn) {
|
Color Management, Stage 2: Switch color pipeline to use OpenColorIO
Replace the old color pipeline, which supported only linear/sRGB color spaces,
with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
images/movies from the color space the file is saved in to Blender's linear
space (for float images only; byte images are not converted internally, only
their input space is stored and used later).
This setting can be found in the image/clip data block settings.
- Display color space, which defines the space a particular display works in.
This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, some additional conversions
can happen apart from converting the image to display space.
These conversions are:
- View, which defines a tone curve applied before the display transformation.
These are different ways to view the image on the same display device.
For example, it can be used to emulate a film view on an sRGB display.
- Exposure, which affects the image exposure before the tone map is applied.
- Gamma, a post-display gamma correction that can be used to match a
particular display's gamma.
- RGB curves, user-defined curves applied before the display transformation,
which can be used for different purposes.
By default all these settings apply only to the render result and do not
affect other images. If a particular image needs to be affected by these
transformations, the "View as Render" setting of the image data block should
be enabled. Movie clips are always affected by all display transformations.
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used if work such as grading needs to be done in a color space
other than sRGB (i.e. when a Film view is used on an sRGB display, using the
VD16 space as the sequencer's internal space keeps grading in a space close
to the one used for display).
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it
was created from 16-bit byte images.
- The color space of the byte buffer is stored in the image buffer's
rect_colorspace property.
- The image buffer's profile was removed since it is no longer meaningful.
- OpenGL and GLSL are assumed to always work in sRGB space. Supporting other
spaces is possible, but it is quite a large project and not that important
right now.
- The old behavior with the legacy Color Management option disabled is
emulated by using the None display. This could cause some regressions, but
there is no clear way to avoid them.
- If OpenColorIO is disabled at build time, Blender should behave the same
way as the previous release with color management enabled.
More details can be found at this page (more details will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas and Lukas Toene for the initial work on OpenColorIO
integration, and to Brecht van Lommel for some further development and code/
use case review!
2012-09-15 10:05:07 +00:00
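The order of the per-pixel display conversions listed above can be sketched as follows. This is a conceptual illustration only, with placeholder functions (an identity view curve and an approximate 2.2 display gamma); it is not the OpenColorIO or Blender API, and it assumes exposure is expressed in stops. RGB curves are omitted for brevity.

#include <math.h>

/* Conceptual display pipeline for one linear float channel:
 * exposure -> view tone curve -> display transform -> post-display gamma. */
static float view_tone_curve(float v)
{
  return v; /* stand-in for e.g. a "Film" view transform */
}

static float display_channel(float v, float exposure, float display_gamma)
{
  v *= powf(2.0f, exposure);            /* exposure, applied before the tone map */
  v = view_tone_curve(v);               /* view: tone curve before the display transform */
  v = powf(v, 1.0f / 2.2f);             /* display transform (approximate sRGB gamma) */
  return powf(v, 1.0f / display_gamma); /* post-display gamma correction */
}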
|
|
|
/* store display device name, don't look up transformations yet
|
|
|
|
* block could be used for non-color displays where looking up the transformation
|
|
|
|
* would slow down redraw, so only look up the actual transform when it's indeed
|
|
|
|
* needed
|
|
|
|
*/
|
2018-12-13 15:59:58 +01:00
|
|
|
STRNCPY(block->display_device, scn->display_settings.display_device);
|
2011-09-22 14:55:39 +00:00
|
|
|
|
|
|
|
/* copy to avoid crash when scene gets deleted with ui still open */
|
2012-03-30 01:51:25 +00:00
|
|
|
block->unit = MEM_mallocN(sizeof(scn->unit), "UI UnitSettings");
|
2011-09-22 14:55:39 +00:00
|
|
|
memcpy(block->unit, &scn->unit, sizeof(scn->unit));
|
2011-08-18 20:01:30 +00:00
|
|
|
}
|
2018-12-13 15:59:58 +01:00
|
|
|
else {
|
|
|
|
STRNCPY(block->display_device, IMB_colormanagement_display_get_default_name());
|
|
|
|
}
|
2011-09-22 14:55:39 +00:00
|
|
|
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
BLI_strncpy(block->name, name, sizeof(block->name));
|
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (region)
|
2014-11-09 21:20:40 +01:00
|
|
|
UI_block_region_set(block, region);
|
UI: Layout Engine
2009-05-15 11:19:59 +00:00
|
|
|
|
2018-04-29 12:24:08 +02:00
|
|
|
/* Set window matrix and aspect for region and OpenGL state. */
|
|
|
|
ui_update_window_matrix(window, region, block);
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
|
2018-04-29 12:24:08 +02:00
|
|
|
/* Tag as popup menu if not created within a region. */
|
|
|
|
if (!(region && region->visible)) {
|
2014-01-04 17:16:19 +11:00
|
|
|
block->auto_open = true;
|
2018-04-29 12:24:08 +02:00
|
|
|
block->flag |= UI_BLOCK_LOOP;
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
return block;
|
|
|
|
}
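For context, here is a hedged sketch of how a block-creation function like the one above is typically used from a region's draw code. The function names and signatures (UI_block_begin, uiDefBut, UI_block_end, UI_block_draw, UI_EMBOSS) are assumptions based on the surrounding API conventions, shown approximately rather than taken from this file.

/* Hypothetical usage sketch: create a block for a region, add a button,
 * finish the block, then draw it. Signatures are approximate. */
static void my_region_draw(const bContext *C, ARegion *region)
{
  uiBlock *block = UI_block_begin(C, region, __func__, UI_EMBOSS);

  uiDefBut(block, UI_BTYPE_BUT, 0, "Hello", 10, 10, 100, 20,
           NULL, 0.0f, 0.0f, 0, 0, "Example button");

  UI_block_end(C, block);
  UI_block_draw(C, block);
}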
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_emboss_set(uiBlock *block, char dt)
|
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here's the real bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for wrong call
to ED_undo_push()
- The extern Py api .h was not included in bpy_interface.c, showing
several calls using different args.
Further just added the missing includes, and removed unused vars.
2009-04-14 15:59:52 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->dt = dt;
|
2.5 More cleanup!
2009-04-14 15:59:52 +00:00
|
|
|
}
|
|
|
|
|
2018-09-11 10:56:08 +10:00
|
|
|
void UI_block_theme_style_set(uiBlock *block, char theme_style)
|
|
|
|
{
|
|
|
|
block->theme_style = theme_style;
|
|
|
|
}
|
|
|
|
|
2019-03-07 12:01:32 +01:00
|
|
|
static void ui_but_build_drawstr_float(uiBut *but, double value)
|
|
|
|
{
|
|
|
|
size_t slen = 0;
|
|
|
|
STR_CONCAT(but->drawstr, slen, but->str);
|
|
|
|
|
|
|
|
PropertySubType subtype = PROP_NONE;
|
|
|
|
if (but->rnaprop) {
|
|
|
|
subtype = RNA_property_subtype(but->rnaprop);
|
|
|
|
}
|
|
|
|
|
|
|
|
if (value == (double)FLT_MAX) {
|
|
|
|
STR_CONCAT(but->drawstr, slen, "inf");
|
|
|
|
}
|
|
|
|
else if (value == (double)-FLT_MAX) {
|
|
|
|
STR_CONCAT(but->drawstr, slen, "-inf");
|
|
|
|
}
|
|
|
|
else if (subtype == PROP_PERCENTAGE) {
|
|
|
|
int prec = ui_but_calc_float_precision(but, value);
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%.*f %%", prec, value);
|
|
|
|
}
|
|
|
|
else if (subtype == PROP_PIXEL) {
|
|
|
|
int prec = ui_but_calc_float_precision(but, value);
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%.*f px", prec, value);
|
|
|
|
}
|
2019-03-13 16:58:00 +01:00
|
|
|
else if (subtype == PROP_FACTOR) {
|
|
|
|
int precision = ui_but_calc_float_precision(but, value);
|
|
|
|
|
|
|
|
if (U.factor_display_type == USER_FACTOR_AS_FACTOR) {
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%.*f", precision, value);
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%.*f %%", MAX2(0, precision - 2), value * 100);
|
|
|
|
}
|
|
|
|
}
|
2019-03-07 12:01:32 +01:00
|
|
|
else if (ui_but_is_unit(but)) {
|
|
|
|
char new_str[sizeof(but->drawstr)];
|
|
|
|
ui_get_but_string_unit(but, new_str, sizeof(new_str), value, true, -1);
|
|
|
|
STR_CONCAT(but->drawstr, slen, new_str);
|
|
|
|
}
|
|
|
|
else {
|
|
|
|
int prec = ui_but_calc_float_precision(but, value);
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%.*f", prec, value);
|
|
|
|
}
|
|
|
|
}
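As a stand-alone illustration of how these functions assemble but->drawstr (the label first, then the value formatted with a subtype-dependent suffix), the same pattern can be written with plain snprintf instead of Blender's STR_CONCAT macros. The values and label below are examples only.

#include <stdio.h>

/* Stand-alone illustration: append label, then formatted value with suffix,
 * into one fixed-size draw string while tracking the running length. */
int main(void)
{
  char drawstr[128];
  size_t slen = 0;
  const int prec = 3;
  const double value = 75.0;

  slen += (size_t)snprintf(drawstr + slen, sizeof(drawstr) - slen, "%s", "Opacity: ");
  slen += (size_t)snprintf(drawstr + slen, sizeof(drawstr) - slen, "%.*f %%", prec, value);

  printf("%s\n", drawstr); /* prints "Opacity: 75.000 %" */
  return 0;
}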
|
|
|
|
|
|
|
|
static void ui_but_build_drawstr_int(uiBut *but, int value)
|
|
|
|
{
|
|
|
|
size_t slen = 0;
|
|
|
|
STR_CONCAT(but->drawstr, slen, but->str);
|
|
|
|
|
|
|
|
PropertySubType subtype = PROP_NONE;
|
|
|
|
if (but->rnaprop) {
|
|
|
|
subtype = RNA_property_subtype(but->rnaprop);
|
|
|
|
}
|
|
|
|
|
|
|
|
STR_CONCATF(but->drawstr, slen, "%d", value);
|
|
|
|
|
|
|
|
if (subtype == PROP_PERCENTAGE) {
|
|
|
|
STR_CONCAT(but->drawstr, slen, "%");
|
|
|
|
}
|
|
|
|
else if (subtype == PROP_PIXEL) {
|
|
|
|
STR_CONCAT(but->drawstr, slen, " px");
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2016-02-23 09:55:16 +11:00
|
|
|
/**
|
|
|
|
* \param but: Button to update.
|
|
|
|
* \param validate: When set, this function may change the button value.
|
|
|
|
* Otherwise treat the button value as read-only.
|
|
|
|
*/
|
2019-03-14 18:27:17 +01:00
|
|
|
static void ui_but_update_ex(uiBut *but, const bool validate)
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
{
|
|
|
|
/* if something changed in the button */
|
2012-03-30 01:51:25 +00:00
|
|
|
double value = UI_BUT_VALUE_UNSET;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_update_select_flag(but, &value);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2010-03-12 14:18:14 +00:00
|
|
|
/* only update soft range while not editing */
|
2019-03-14 18:27:17 +01:00
|
|
|
if (!ui_but_is_editing(but)) {
|
2013-07-09 04:00:56 +00:00
|
|
|
if ((but->rnaprop != NULL) ||
|
|
|
|
(but->poin && (but->pointype & UI_BUT_POIN_TYPES)))
|
|
|
|
{
|
|
|
|
ui_set_but_soft_range(but);
|
|
|
|
}
|
2011-06-16 06:00:02 +00:00
|
|
|
}
|
2010-03-12 14:18:14 +00:00
|
|
|
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
/* test for min and max, icon sliders, etc */
|
2012-03-30 01:51:25 +00:00
|
|
|
switch (but->type) {
|
2014-11-09 21:20:40 +01:00
|
|
|
case UI_BTYPE_NUM:
|
|
|
|
case UI_BTYPE_SCROLL:
|
|
|
|
case UI_BTYPE_NUM_SLIDER:
|
2016-02-23 09:55:16 +11:00
|
|
|
if (validate) {
|
|
|
|
UI_GET_BUT_VALUE_INIT(but, value);
|
|
|
|
if (value < (double)but->hardmin) {
|
|
|
|
ui_but_value_set(but, but->hardmin);
|
|
|
|
}
|
|
|
|
else if (value > (double)but->hardmax) {
|
|
|
|
ui_but_value_set(but, but->hardmax);
|
|
|
|
}
|
2015-01-06 00:02:57 +01:00
|
|
|
|
2016-02-23 09:55:16 +11:00
|
|
|
/* max must never be smaller than min! Both being equal is allowed though */
|
|
|
|
BLI_assert(but->softmin <= but->softmax &&
|
|
|
|
but->hardmin <= but->hardmax);
|
|
|
|
}
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
break;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2015-01-06 00:02:57 +01:00
|
|
|
case UI_BTYPE_ICON_TOGGLE:
|
2014-11-09 21:20:40 +01:00
|
|
|
case UI_BTYPE_ICON_TOGGLE_N:
|
2018-10-29 16:58:34 +01:00
|
|
|
if ((but->rnaprop == NULL) || (RNA_property_flag(but->rnaprop) & PROP_ICONS_CONSECUTIVE)) {
|
|
|
|
if (but->rnaprop && RNA_property_flag(but->rnaprop) & PROP_ICONS_REVERSE) {
|
|
|
|
but->drawflag |= UI_BUT_ICON_REVERSE;
|
|
|
|
}
|
|
|
|
|
2018-10-25 13:41:32 +00:00
|
|
|
but->iconadd = (but->flag & UI_SELECT) ? 1 : 0;
|
2009-06-16 01:08:39 +00:00
|
|
|
}
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
break;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-09-10 06:05:19 +00:00
|
|
|
/* quiet warnings for unhandled types */
|
|
|
|
default:
|
|
|
|
break;
|
Port of part of the Interface code to 2.50.
2008-11-11 18:31:32 +00:00
|
|
|
}
  /* safety is 4 to enable small number buttons (like 'users') */
  // okwidth = -4 + (BLI_rcti_size_x(&but->rect)); // UNUSED

  /* name: */
  switch (but->type) {
    case UI_BTYPE_MENU:
      if (BLI_rctf_size_x(&but->rect) >= (UI_UNIT_X * 2)) {
        /* only needed for menus in popup blocks that don't recreate buttons on redraw */
        if (but->block->flag & UI_BLOCK_LOOP) {
          if (but->rnaprop && (RNA_property_type(but->rnaprop) == PROP_ENUM)) {
            int value_enum = RNA_property_enum_get(&but->rnapoin, but->rnaprop);

            EnumPropertyItem item;
            if (RNA_property_enum_item_from_value_gettexted(
                    but->block->evil_C,
                    &but->rnapoin, but->rnaprop, value_enum, &item))
            {
              size_t slen = strlen(item.name);
              ui_but_string_free_internal(but);
              ui_but_string_set_internal(but, item.name, slen);
              but->icon = item.icon;
            }
          }
        }
        BLI_strncpy(but->drawstr, but->str, sizeof(but->drawstr));
      }
      break;

    case UI_BTYPE_NUM:
    case UI_BTYPE_NUM_SLIDER:
      if (but->editstr) {
        break;
      }
      UI_GET_BUT_VALUE_INIT(but, value);
      if (ui_but_is_float(but)) {
        ui_but_build_drawstr_float(but, value);
      }
      else {
        ui_but_build_drawstr_int(but, (int)value);
      }
      break;
    case UI_BTYPE_LABEL:
      if (ui_but_is_float(but)) {
        int prec;

        UI_GET_BUT_VALUE_INIT(but, value);
        prec = ui_but_calc_float_precision(but, value);
        BLI_snprintf(but->drawstr, sizeof(but->drawstr), "%s%.*f", but->str, prec, value);
      }
      else {
        BLI_strncpy(but->drawstr, but->str, UI_MAX_DRAW_STR);
      }

      break;
    case UI_BTYPE_TEXT:
    case UI_BTYPE_SEARCH_MENU:
      if (!but->editstr) {
        char str[UI_MAX_DRAW_STR];
        ui_but_string_get(but, str, UI_MAX_DRAW_STR);
        BLI_snprintf(but->drawstr, sizeof(but->drawstr), "%s%s", but->str, str);
      }
      break;

    case UI_BTYPE_KEY_EVENT:
    {
      const char *str;
      if (but->flag & UI_SELECT) {
        str = "Press a key";
      }
      else {
        UI_GET_BUT_VALUE_INIT(but, value);
        str = WM_key_event_string((short)value, false);
      }
      BLI_snprintf(but->drawstr, UI_MAX_DRAW_STR, "%s%s", but->str, str);
      break;
    }
    case UI_BTYPE_HOTKEY_EVENT:
      if (but->flag & UI_SELECT) {

        if (but->modifier_key) {
          char *str = but->drawstr;
          but->drawstr[0] = '\0';

          if (but->modifier_key & KM_SHIFT)
            str += BLI_strcpy_rlen(str, "Shift ");
          if (but->modifier_key & KM_CTRL)
            str += BLI_strcpy_rlen(str, "Ctrl ");
          if (but->modifier_key & KM_ALT)
            str += BLI_strcpy_rlen(str, "Alt ");
          if (but->modifier_key & KM_OSKEY)
            str += BLI_strcpy_rlen(str, "Cmd ");

          (void)str; /* UNUSED */
        }
        else {
          BLI_strncpy(but->drawstr, "Press a key", UI_MAX_DRAW_STR);
        }
      }
      else
        BLI_strncpy(but->drawstr, but->str, UI_MAX_DRAW_STR);

      break;
Key Configuration
Keymaps are now saveable and configurable from the user preferences. Note
that editing one item in a keymap means the whole keymap is then defined by
the user and will no longer be updated by Blender; an option for syncing might
be added later. The outliner interface is still there, but I will probably
remove it.
There's actually 3 levels now:
* Default builtin key configuration.
* Key configuration loaded from .py file, for configs like Blender 2.4x
or other 3D applications.
* Keymaps edited by the user and saved in .B.blend. These can be saved
to .py files as well to make creating distributable configurations
easier.
Also, user preferences sections were reorganized a bit, now there is:
Interface, Editing, Input, Files and System.
Implementation notes:
* wmKeyConfig was added which represents a key configuration containing
keymaps.
* wmKeymapItem was renamed to wmKeyMapItem for consistency with wmKeyMap.
* Modal maps are not wrapped yet.
* User preferences DNA file reading did not support newdataadr() yet,
added this now for reading keymaps.
* Key configuration related settings are now RNA wrapped.
* is_property_set and is_property_hidden python methods were added.
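As an aside on the key-configuration notes above: the three levels described
(built-in defaults, configurations loaded from .py files, and user-edited
keymaps) are all reachable from Python. The following is a minimal sketch
against the present-day bpy API, which may differ from what existed at the
time of this commit; the property name used at the end is a placeholder
chosen for illustration, not something taken from the patch.

import bpy

wm = bpy.context.window_manager

# The built-in default configuration and the user-edited one are exposed
# as separate key configurations on the window manager.
print(wm.keyconfigs.default.name)
print(wm.keyconfigs.user.name)

# Configurations loaded from (or exported to) .py files show up in the
# same collection alongside the defaults.
for kc in wm.keyconfigs:
    print(kc.name)

# The RNA introspection methods mentioned above; "name" is a placeholder
# property used only to show the call signature.
obj = bpy.context.object
print(obj.is_property_set("name"), obj.is_property_hidden("name"))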
    case UI_BTYPE_HSVCUBE:
    case UI_BTYPE_HSVCIRCLE:
      break;

    default:
      BLI_strncpy(but->drawstr, but->str, UI_MAX_DRAW_STR);
      break;
  }

  /* if we are doing text editing, this will override the drawstr */
  if (but->editstr)
    but->drawstr[0] = '\0';

  /* text clipping moved to widget drawing code itself */
}

void ui_but_update(uiBut *but)
{
  ui_but_update_ex(but, false);
}

void ui_but_update_edited(uiBut *but)
{
  ui_but_update_ex(but, true);
}
void UI_block_align_begin(uiBlock *block)
{
  /* if other align was active, end it */
  if (block->flag & UI_BUT_ALIGN) UI_block_align_end(block);
    block->flag |= UI_BUT_ALIGN_DOWN;
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correctly inside other layouts.
* API was changed to allow better nesting. Previously items were added
to the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now creating things like
rows or columns always returns a layout which you then add the items
to. All py scripts were updated to follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending on whether they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
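As a hedged C-side sketch of the nesting change described above, using the
layout functions from UI_interface.h (the exact signatures and the RNA property
names used here are illustrative and may differ from the code of that time):

static void template_sketch(uiLayout *layout, PointerRNA *ptr)
{
    /* column/row calls now return the layout that items are added to;
     * align=true groups the buttons visually. */
    uiLayout *col = uiLayoutColumn(layout, true);
    uiItemR(col, ptr, "location", 0, NULL, ICON_NONE);

    /* a row nested inside the column */
    uiLayout *row = uiLayoutRow(col, false);
    uiItemR(row, ptr, "hide_viewport", 0, NULL, ICON_NONE);
    uiItemR(row, ptr, "hide_render", 0, NULL, ICON_NONE);
}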
    block->alignnr++;

    /* buttons declared after this call will get this align nr */ // XXX flag?
}

void UI_block_align_end(uiBlock *block)
{
    block->flag &= ~UI_BUT_ALIGN; /* all 4 flags */
}

struct ColorManagedDisplay *ui_block_cm_display_get(uiBlock *block)
Color Management, Stage 2: Switch color pipeline to use OpenColorIO
Replace the old color pipeline, which supported only linear/sRGB color spaces,
with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
images/movies from the color space the file is saved in to Blender's linear
space (for float images; byte images are not converted internally, only the input
space is stored for such images and used later).
This setting can be found in the image/clip data block settings.
- Display color space, which defines the space in which a particular display works.
This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, apart from converting the image
to display space, some additional conversions can happen.
These conversions are:
- View, which defines a tone curve applied before the display transformation.
These are different ways to view the image on the same display device.
For example, it can be used to emulate a film view on an sRGB display.
- Exposure, which affects the image exposure before the tone map is applied.
- Gamma, a post-display gamma correction that can be used to match a particular
display's gamma.
- RGB curves, user-defined curves applied before the display
transformation, which can be used for different purposes.
By default all these settings apply only to the render result and do not
affect other images. If a particular image needs to be affected by this
transformation, the "View as Render" setting of the image data block should be
enabled. Movie clips are always affected by all display transformations.
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used if things such as grading need to be done in a color space
different from sRGB (i.e. when the Film view is used on an sRGB display, using the VD16
space as the sequencer's internal space makes grading work in a space
which is close to the space used for display).
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it was
created from 16bit byte images.
- The space of the byte buffer is stored in the image buffer's rect_colorspace property.
- The image buffer's profile was removed since it is no longer meaningful.
- OpenGL and GLSL are assumed to always work in sRGB space. It is possible
to support other spaces, but that is quite a large project and not so
important right now.
- Disabling the legacy Color Management option is emulated by using the None display.
This could cause some regressions, but there's no clear way to avoid them.
- If OpenColorIO is disabled at build time, Blender should behave
in the same way as the previous release with color management enabled.
More details can be found on this page (more will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas, Lukas Toene for initial work on OpenColorIO
integration and to Brecht van Lommel for some further development and code/
usecase review!
{
    return IMB_colormanagement_display_get_named(block->display_device);
}

void ui_block_cm_to_display_space_v3(uiBlock *block, float pixel[3])
{
    struct ColorManagedDisplay *display = ui_block_cm_display_get(block);
    IMB_colormanagement_scene_linear_to_display_v3(pixel, display);
}
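A brief usage sketch of the two helpers above; the function name and color
values are made up for illustration:

static void draw_swatch_sketch(uiBlock *block)
{
    /* convert a scene-linear RGB swatch to the block's display space */
    float swatch[3] = {0.25f, 0.5f, 1.0f};

    ui_block_cm_to_display_space_v3(block, swatch);
    /* swatch is now in the space of block->display_device and could be
     * handed to the drawing code */
    (void)swatch;
}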
UI: New Global Top-Bar (WIP)
== Main Features/Changes for Users
* Add horizontal bar at top of all non-temp windows, consisting of two horizontal sub-bars.
* Upper sub-bar contains global menus (File, Render, etc.), tabs for workspaces and scene selector.
* Lower sub-bar contains object mode selector, screen-layout and render-layer selector. Later operator and/or tool settings will be placed here.
* Individual sections of the topbar are individually scrollable.
* Workspace tabs can be double- or ctrl-clicked for renaming and contain 'x' icon for deleting.
* Top-bar should scale nicely with DPI.
* The lower half of the top-bar can be hidden by dragging the lower top-bar edge up. Better hiding options are planned (e.g. hide in fullscreen modes).
* Info editors at the top of the window that use the full window width will be replaced by the top-bar.
* In fullscreen modes, no more info editor is added on top, the top-bar replaces it.
== Technical Features/Changes
* Adds initial support for global areas
A global area is part of the window, not part of the regular screen-layout.
I've added a macro iterator to iterate over both global and screen-layout level areas. When iterating over areas, from now on developers should always consider whether they have to include global areas.
* Adds a TOPBAR editor type
The editor type is hidden in the UI editor type menu.
* Adds a variation of the ID template to display IDs as tab buttons (template_ID_tabs in BPY)
* Does various changes to RNA button creation code to improve their appearance in the horizontal top-bar.
* Adds support for dynamically sized regions. That is, regions that scale automatically to the layout bounds.
The code for this is currently a big hack (it's based on drawing the UI multiple times). This should definitely be improved.
* Adds a template for displaying operator properties optimized for the top-bar. This will probably change a lot still and is in fact disabled in code.
Since the final top-bar design depends a lot on other 2.8 designs (mainly tool-system and workspaces), we decided to not show the operator or tool settings in the top-bar for now. That means most of the lower sub-bar is empty for the time being.
NOTE: Top-bar or global area data is not written to files or SDNA. They are simply added to the window when opening Blender or reading a file. This allows us to change the top-bar without having to worry about compatibility.
== ToDo's
It's a bit hard to predict all the ToDo's; here are the main known ones:
* Add options for the new active-tool system and for operator redo to the topbar.
* Automatically hide the top-bar in fullscreen modes.
* General visual polish.
* Top-bar drag & drop support (WIP in temp-tab_drag_drop).
* Improve dynamic regions (should also fix some layout glitches).
* Make internal terminology consistent.
* Enable topbar file writing once design is more advanced.
* Address TODO's and XXX's in code :)
Thanks @brecht for the review! And @sergey for the complaining ;)
Differential Revision: D2758
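As a hedged sketch of what iterating over both global and screen-layout areas
amounts to (the field names such as global_areas.areabase are assumptions about
the window/screen data of that time, not necessarily exact):

static void foreach_area_sketch(wmWindow *win, bScreen *screen,
                                void (*visit)(ScrArea *area, void *user_data),
                                void *user_data)
{
    /* window-level (global) areas first, e.g. the top-bar */
    for (ScrArea *area = win->global_areas.areabase.first; area; area = area->next) {
        visit(area, user_data);
    }
    /* then the areas of the active screen layout */
    for (ScrArea *area = screen->areabase.first; area; area = area->next) {
        visit(area, user_data);
    }
}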
static uiBut *ui_but_alloc(const eButType type)
{
    switch (type) {
        case UI_BTYPE_TAB:
            return MEM_callocN(sizeof(uiButTab), "uiButTab");
        default:
            return MEM_callocN(sizeof(uiBut), "uiBut");
    }
}
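The per-type allocation above relies on the larger button structs embedding
uiBut as their first member, so the result can be handled through a plain uiBut
pointer; a hedged sketch of that layout (the real uiButTab is defined in the
interface headers and may carry different fields):

typedef struct uiButTab_sketch {
    uiBut but;              /* must come first: a pointer to this struct is also a valid uiBut pointer */
    struct MenuType *menu;  /* hypothetical per-tab payload, for illustration only */
} uiButTab_sketch;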
/**
 * \brief ui_def_but is the function that draws many button types
 *
 * \param x, y: The lower left hand corner of the button (X axis)
 * \param width, height: The size of the button.
 *
 * for float buttons:
 * \param a1: Click Step (how much to change the value each click)
 * \param a2: Number of decimal point values to display. 0 defaults to 3 (0.000)
 *            1, 2, 3, and a maximum of 4; all greater values will be clamped to 4.
 */
static uiBut *ui_def_but(
        uiBlock *block, int type, int retval, const char *str,
        int x, int y, short width, short height,
        void *poin, float min, float max, float a1, float a2, const char *tip)
{
    uiBut *but;
    int slen;

    BLI_assert(width >= 0 && height >= 0);

    /* we could do some more error checks here */
    if ((type & BUTTYPE) == UI_BTYPE_LABEL) {
        BLI_assert((poin != NULL || min != 0.0f || max != 0.0f || (a1 == 0.0f && a2 != 0.0f) || (a1 != 0.0f && a1 != 1.0f)) == false);
    }

    if (type & UI_BUT_POIN_TYPES) {  /* a pointer is required */
        if (poin == NULL) {
            BLI_assert(0);
            return NULL;
        }
    }
    but = ui_but_alloc(type & BUTTYPE);
    but->type = type & BUTTYPE;
    but->pointype = type & UI_BUT_POIN_TYPES;
    but->bit = type & UI_BUT_POIN_BIT;
    but->bitnr = type & 31;
    but->icon = ICON_NONE;
    but->iconadd = 0;
    but->retval = retval;

    slen = strlen(str);
    ui_but_string_set_internal(but, str, slen);

    but->rect.xmin = x;
    but->rect.ymin = y;
    but->rect.xmax = but->rect.xmin + width;
    but->rect.ymax = but->rect.ymin + height;

    but->poin = poin;
    but->hardmin = but->softmin = min;
    but->hardmax = but->softmax = max;
    but->a1 = a1;
    but->a2 = a2;
    but->tip = tip;
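For context, a hedged usage sketch of a public wrapper such as uiDefButF, which
is assumed to forward these arguments into ui_def_but; the label, sizes and
limits are made up for illustration:

static float value_sketch = 1.0f;

void sketch_add_scale_button(uiBlock *block, int x, int y)
{
    /* a float number button: click step (a1) of 1.0, three decimals shown (a2) */
    uiDefButF(block, UI_BTYPE_NUM, 0, "Scale", x, y, 5 * UI_UNIT_X, UI_UNIT_Y,
              &value_sketch, 0.0f, 10.0f, 1.0f, 3.0f, "Illustrative scale value");
}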
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
|
2016-09-19 02:25:59 +02:00
|
|
|
    but->disabled_info = block->lockstr;
    but->dt = block->dt;

Pie Menus C code backend.
This commit merges the code in the pie-menu branch.
As per decisions taken the last few days, there are no pie menus
included and there will be an official add-on including overrides of
some keys with pie menus. However, people will now be able to use the
new code in python.
Full Documentation is in http://wiki.blender.org/index.php/Dev:Ref/
Thanks:
Campbell Barton, Dalai Felinto and Ton Roosendaal for the code review
and design comments
Jonathan Williamson, Pawel Lyczkowski, Pablo Vazquez among others for
suggestions during the development.
Special Thanks to Sean Olson, for his support, suggestions, testing and
merciless bugging so that I would finish the pie menu code. Without him
we wouldn't be here. Also to the rest of the developers of the original
python add-on, Patrick Moore and Dan Eicher and finally to Matt Ebb, who
did the research and first implementation and whose code I used to get
started.
2014-08-11 10:39:59 +02:00
|

    but->pie_dir = UI_RADIAL_NONE;

    but->block = block; /* pointer back, used for frontbuffer status, and picker */
    if ((block->flag & UI_BUT_ALIGN) && ui_but_can_align(but))
        but->alignnr = block->alignnr;

UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correct inside other layouts.
* API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now when creating things
like rows or columns it always returns a layout which you have to add
the items in (see the sketch below this message). All py scripts were updated to follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending if they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
2009-05-15 11:19:59 +00:00
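To make the row/column change above concrete, here is a minimal sketch of template code written against the new API (uiLayoutColumn, uiLayoutRow and uiLayoutSplit appear later in this file; uiItemR and the property names are assumptions used only for illustration):

    /* each row/column call returns the sub-layout that items must be added to,
     * instead of items implicitly landing in the last created layout */
    uiLayout *col = uiLayoutColumn(layout, true); /* align=true */
    uiItemR(col, ptr, "location", 0, NULL, ICON_NONE);

    uiLayout *row = uiLayoutRow(col, false);
    uiItemR(row, ptr, "lock_x", 0, NULL, ICON_NONE);
    uiItemR(row, ptr, "lock_y", 0, NULL, ICON_NONE);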

    but->func = block->func;
    but->func_arg1 = block->func_arg1;
    but->func_arg2 = block->func_arg2;
    but->funcN = block->funcN;
    if (block->func_argN)
        but->func_argN = MEM_dupallocN(block->func_argN);

    but->pos = -1; /* cursor invisible */
    if (ELEM(but->type, UI_BTYPE_NUM, UI_BTYPE_NUM_SLIDER)) { /* add a space to name */
        /* slen remains unchanged from previous assignment, ensure this stays true */
        if (slen > 0 && slen < UI_MAX_NAME_STR - 2) {
            if (but->str[slen - 1] != ' ') {
                but->str[slen] = ' ';
                but->str[slen + 1] = 0;
            }
        }
    }

    if (block->flag & UI_BLOCK_RADIAL) {
        but->drawflag |= UI_BUT_TEXT_LEFT;
        if (but->str && but->str[0]) {
            but->drawflag |= UI_BUT_ICON_LEFT;
        }
    }
    else if (((block->flag & UI_BLOCK_LOOP) && !ui_block_is_popover(block)) ||
             ELEM(but->type,
                  UI_BTYPE_MENU, UI_BTYPE_TEXT, UI_BTYPE_LABEL,
                  UI_BTYPE_BLOCK, UI_BTYPE_BUT_MENU, UI_BTYPE_SEARCH_MENU,
                  UI_BTYPE_PROGRESS_BAR, UI_BTYPE_POPOVER))
    {
        but->drawflag |= (UI_BUT_TEXT_LEFT | UI_BUT_ICON_LEFT);
    }
#ifdef USE_NUMBUTS_LR_ALIGN
    else if (ELEM(but->type, UI_BTYPE_NUM, UI_BTYPE_NUM_SLIDER)) {
        if (slen != 0) {
            but->drawflag |= UI_BUT_TEXT_LEFT;
        }
    }
#endif
    but->drawflag |= (block->flag & UI_BUT_ALIGN);
    if (block->lock == true) {
        but->flag |= UI_BUT_DISABLED;
    }

    /* keep track of UI_interface.h */
    if (ELEM(but->type,
             UI_BTYPE_BLOCK, UI_BTYPE_BUT, UI_BTYPE_LABEL,
             UI_BTYPE_PULLDOWN, UI_BTYPE_ROUNDBOX, UI_BTYPE_LISTBOX,
             UI_BTYPE_BUT_MENU, UI_BTYPE_SCROLL, UI_BTYPE_GRIP,
             UI_BTYPE_SEPR, UI_BTYPE_SEPR_LINE,
             UI_BTYPE_SEPR_SPACER) ||
        (but->type >= UI_BTYPE_SEARCH_MENU))
    {
        /* pass */
    }
    else {
        but->flag |= UI_BUT_UNDO;
    }

    BLI_addtail(&block->buttons, but);

    if (block->curlayout)
        ui_layout_add_but(block->curlayout, but);

#ifdef WITH_PYTHON
    /* if the 'UI_OT_editsource' is running, extract the source info from the button */
    if (UI_editsource_enable_check()) {
        UI_editsource_active_but_test(but);
    }
#endif
    return but;
}
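For orientation, the public uiDefBut entry point used further down in this file is presumably only a thin wrapper over the internal constructor above; a sketch under that assumption (ui_but_update is an assumed finalization call, it does not appear in this excerpt):

    uiBut *uiDefBut(uiBlock *block, int type, int retval, const char *str,
                    int x, int y, short width, short height,
                    void *poin, float min, float max, float a1, float a2, const char *tip)
    {
        uiBut *but = ui_def_but(block, type, retval, str, x, y, width, height,
                                poin, min, max, a1, a2, tip);

        ui_but_update(but); /* assumed: recompute derived state after creation */

        return but;
    }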

void ui_def_but_icon(uiBut *but, const int icon, const int flag)
{
    if (icon) {
        ui_icon_ensure_deferred(but->block->evil_C, icon, (flag & UI_BUT_ICON_PREVIEW) != 0);
    }
    but->icon = (BIFIconID)icon;
    but->flag |= flag;

    if (but->str && but->str[0]) {
        but->drawflag |= UI_BUT_ICON_LEFT;
    }
}
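Hypothetical call sites for the function above, to show how the flag argument combines the has-icon marker with the preview variant (both flags appear elsewhere in this file):

    ui_def_but_icon(but, icon, UI_HAS_ICON);                        /* regular icon */
    ui_def_but_icon(but, icon, UI_HAS_ICON | UI_BUT_ICON_PREVIEW);  /* large deferred preview icon */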

static void ui_def_but_rna__disable(uiBut *but, const char *info)
{
    but->flag |= UI_BUT_DISABLED;
    but->disabled_info = info;
}

static void ui_def_but_rna__menu(bContext *UNUSED(C), uiLayout *layout, void *but_p)
{
    uiBlock *block = uiLayoutGetBlock(layout);
    uiPopupBlockHandle *handle = block->handle;
    uiBut *but = (uiBut *)but_p;

    /* see comment in ui_item_enum_expand, re: uiname */
    const EnumPropertyItem *item, *item_array;
    bool free;

    uiLayout *split, *column = NULL;

    int totitems = 0;
    int columns, rows, a, b;
    int column_end = 0;
    int nbr_entries_nosepr = 0;

    UI_block_flag_enable(block, UI_BLOCK_MOVEMOUSE_QUIT);

    RNA_property_enum_items_gettexted(block->evil_C, &but->rnapoin, but->rnaprop, &item_array, NULL, &free);

    /* we dont want nested rows, cols in menus */
    UI_block_layout_set_current(block, layout);

    for (item = item_array; item->identifier; item++, totitems++) {
        if (!item->identifier[0]) {
            /* inconsistent, but menus with categories do not look good flipped */
            if (item->name) {
                block->flag |= UI_BLOCK_NO_FLIP;
                nbr_entries_nosepr++;
            }
            /* We do not want simple separators in nbr_entries_nosepr count */
            continue;
        }
        nbr_entries_nosepr++;
    }

    /* Columns and row estimation. Ignore simple separators here. */
    columns = (nbr_entries_nosepr + 20) / 20;
    if (columns < 1)
        columns = 1;
    if (columns > 8)
        columns = (nbr_entries_nosepr + 25) / 25;

    rows = totitems / columns;
    if (rows < 1)
        rows = 1;
    while (rows * columns < totitems)
        rows++;
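    /* Worked example of the estimate above (illustrative note, not part of the original source):
     * 45 non-separator entries give columns = (45 + 20) / 20 = 3 and rows = 45 / 3 = 15;
     * since 15 * 3 >= 45 no extra row is added. 10 entries give columns = 1, rows = 10. */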

    if (block->flag & UI_BLOCK_NO_FLIP) {
        /* Title at the top for menus with categories. */
        uiDefBut(block, UI_BTYPE_LABEL, 0, RNA_property_ui_name(but->rnaprop),
                 0, 0, UI_UNIT_X * 5, UI_UNIT_Y, NULL, 0.0, 0.0, 0, 0, "");
        uiItemS(layout);
    }

    /* note, item_array[...] is reversed on access */

    /* create items */
    split = uiLayoutSplit(layout, 0.0f, false);

    for (a = 0; a < totitems; a++) {
        if (a == column_end) {
            /* start new column, and find out where it ends in advance, so we
             * can flip the order of items properly per column */
            column_end = totitems;

            for (b = a + 1; b < totitems; b++) {
                item = &item_array[b];

                /* new column on N rows or on separation label */
                if (((b - a) % rows == 0) || (!item->identifier[0] && item->name)) {
                    column_end = b;
                    break;
                }
            }

            column = uiLayoutColumn(split, false);
        }

        item = &item_array[a];

        if (!item->identifier[0]) {
            if (item->name) {
                if (item->icon) {
                    uiItemL(column, item->name, item->icon);
                }
                else {
                    /* Do not use uiItemL here, as our root layout is a menu one,
                     * it will add a fake blank icon! */
                    uiDefBut(block, UI_BTYPE_LABEL, 0, item->name, 0, 0, UI_UNIT_X * 5, UI_UNIT_Y, NULL, 0.0, 0.0, 0, 0, "");
                }
            }
            else {
                uiItemS(column);
            }
        }
        else {
            if (item->icon) {
                uiDefIconTextButI(
                        block, UI_BTYPE_BUT_MENU, B_NOP, item->icon, item->name, 0, 0,
                        UI_UNIT_X * 5, UI_UNIT_Y, &handle->retvalue, item->value, 0.0, 0, -1, item->description);
            }
            else {
                uiDefButI(
                        block, UI_BTYPE_BUT_MENU, B_NOP, item->name, 0, 0,
                        UI_UNIT_X * 5, UI_UNIT_X, &handle->retvalue, item->value, 0.0, 0, -1, item->description);
            }
        }
    }

    if (!(block->flag & UI_BLOCK_NO_FLIP)) {
        /* Title at the bottom for menus without categories. */
        uiItemS(layout);
        uiDefBut(block, UI_BTYPE_LABEL, 0, RNA_property_ui_name(but->rnaprop),
                 0, 0, UI_UNIT_X * 5, UI_UNIT_Y, NULL, 0.0, 0.0, 0, 0, "");
    }

    UI_block_layout_set_current(block, layout);

    if (free) {
        MEM_freeN((void *)item_array);
    }
    BLI_assert((block->flag & UI_BLOCK_IS_FLIP) == 0);
    block->flag |= UI_BLOCK_IS_FLIP;
}

static void ui_but_submenu_enable(uiBlock *block, uiBut *but)
{
    but->flag |= UI_BUT_ICON_SUBMENU;
    block->content_hints |= UI_BLOCK_CONTAINS_SUBMENU_BUT;
}

/**
 * ui_def_but_rna_propname and ui_def_but_rna
 * both take the same args except for propname vs prop, this is done so we can
 * avoid an extra lookup on 'prop' when it's already available.
 *
 * When this kind of change won't disrupt branches, best look into making more
 * of our UI functions take prop rather than propname.
 */
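For reference, the propname variant named in the comment above presumably just resolves the property and forwards to ui_def_but_rna, which is defined next; a sketch under that assumption (RNA_struct_find_property and the fallback string are assumptions here; ui_def_but, ui_def_but_rna and ui_def_but_rna__disable are the functions seen in this file):

    static uiBut *ui_def_but_rna_propname(
            uiBlock *block, int type, int retval, const char *str,
            int x, int y, short width, short height,
            PointerRNA *ptr, const char *propname, int index,
            float min, float max, float a1, float a2, const char *tip)
    {
        PropertyRNA *prop = RNA_struct_find_property(ptr, propname);
        uiBut *but;

        if (prop) {
            but = ui_def_but_rna(block, type, retval, str, x, y, width, height,
                                 ptr, prop, index, min, max, a1, a2, tip);
        }
        else {
            /* keep a visible but inert placeholder so layouts do not break */
            but = ui_def_but(block, type, retval, propname, x, y, width, height,
                             NULL, min, max, a1, a2, tip);
            ui_def_but_rna__disable(but, "Unknown Property.");
        }
        return but;
    }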

static uiBut *ui_def_but_rna(
        uiBlock *block, int type, int retval, const char *str,
        int x, int y, short width, short height,
        PointerRNA *ptr, PropertyRNA *prop, int index,
        float min, float max, float a1, float a2, const char *tip)
{
    const PropertyType proptype = RNA_property_type(prop);
    uiBut *but;
    int icon = 0;
    uiMenuCreateFunc func = NULL;

    if (ELEM(type, UI_BTYPE_COLOR, UI_BTYPE_HSVCIRCLE, UI_BTYPE_HSVCUBE)) {
        BLI_assert(index == -1);
    }

    /* use rna values if parameters are not specified */
    if ((proptype == PROP_ENUM) && ELEM(type, UI_BTYPE_MENU, UI_BTYPE_ROW, UI_BTYPE_LISTROW)) {
        /* UI_BTYPE_MENU is handled a little differently here */
        const EnumPropertyItem *item;
        int value;
        bool free;
        int i;

        RNA_property_enum_items(block->evil_C, ptr, prop, &item, NULL, &free);

        if (type == UI_BTYPE_MENU) {
            value = RNA_property_enum_get(ptr, prop);
        }
        else {
            value = (int)max;
        }

        i = RNA_enum_from_value(item, value);
        if (i != -1) {

RNA
* Enums can now be dynamically created in the _itemf callback,
using RNA_enum_item(s)_add, RNA_enum_item_end. All places asking
for enum items now need to potentially free the items.
* This callback now also gets context, this was added specifically
for operators. This doesn't fit design well at all, needed to do
some ugly hacks, but can't find a good solution at the moment.
* All enums must have a default list of items too, even with an
_itemf callback, for docs and fallback in case there is no context.
* Used by MESH_OT_merge, MESH_OT_select_similar, TFM_OT_select_orientation.
* Also changes some operator properties that were enums to booleans
(unselected, deselect), to make them consistent with other ops.
2009-07-10 19:56:13 +00:00

            if (!str) {
                str = item[i].name;
#ifdef WITH_INTERNATIONAL
                str = CTX_IFACE_(RNA_property_translation_context(prop), str);
#endif
            }

            icon = item[i].icon;
        }
        else {
            if (!str) {
                if (type == UI_BTYPE_MENU) {
                    str = "";
                }
                else {
                    str = RNA_property_ui_name(prop);
                }

2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there are also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was but this is difficult with keymap strings in
there, now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
2009-01-30 12:18:08 +00:00

            }
        }

        if (type == UI_BTYPE_MENU) {
            func = ui_def_but_rna__menu;
        }

        if (free) {
            MEM_freeN((void *)item);
        }
    }
    else {
        if (!str) {
            str = RNA_property_ui_name(prop);
        }
        icon = RNA_property_ui_icon(prop);
    }

    if (!tip && proptype != PROP_ENUM)
        tip = RNA_property_ui_description(prop);

    if (min == max || a1 == -1 || a2 == -1) {
        if (proptype == PROP_INT) {
            int hardmin, hardmax, softmin, softmax, step;

            RNA_property_int_range(ptr, prop, &hardmin, &hardmax);
            RNA_property_int_ui_range(ptr, prop, &softmin, &softmax, &step);

            if (!ELEM(type, UI_BTYPE_ROW, UI_BTYPE_LISTROW) && min == max) {
                min = hardmin;
                max = hardmax;
            }
            if (a1 == -1)
                a1 = step;
            if (a2 == -1)
                a2 = 0;
        }
        else if (proptype == PROP_FLOAT) {
            float hardmin, hardmax, softmin, softmax, step, precision;

            RNA_property_float_range(ptr, prop, &hardmin, &hardmax);
            RNA_property_float_ui_range(ptr, prop, &softmin, &softmax, &step, &precision);

            if (!ELEM(type, UI_BTYPE_ROW, UI_BTYPE_LISTROW) && min == max) {
                min = hardmin;
                max = hardmax;
            }
            if (a1 == -1)
                a1 = step;
            if (a2 == -1)
                a2 = precision;
        }
        else if (proptype == PROP_STRING) {
            min = 0;
            max = RNA_property_string_maxlength(prop);
            /* note, 'max' may be zero (code for dynamically resized array) */
        }
    }
|
2008-12-16 07:55:43 +00:00
|
|
|
|
|
|
|
/* now create button */
|
2012-08-18 18:11:51 +00:00
|
|
|
but = ui_def_but(block, type, retval, str, x, y, width, height, NULL, min, max, a1, a2, tip);
|
2009-01-15 04:13:38 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
but->rnapoin = *ptr;
|
|
|
|
but->rnaprop = prop;
|
2009-03-25 20:49:15 +00:00
|
|
|
|
2013-09-16 01:35:52 +00:00
|
|
|
if (RNA_property_array_check(but->rnaprop))
|
2012-03-30 01:51:25 +00:00
|
|
|
but->rnaindex = index;
|
2011-08-06 14:57:55 +00:00
|
|
|
else
|
2012-03-30 01:51:25 +00:00
|
|
|
but->rnaindex = 0;
|
2009-06-16 01:08:39 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (icon) {
|
2015-05-11 16:29:12 +02:00
|
|
|
ui_def_but_icon(but, icon, UI_HAS_ICON);
|
2009-01-15 04:13:38 +00:00
|
|
|
}
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2018-12-03 18:43:33 +11:00
|
|
|
if (type == UI_BTYPE_MENU) {
|
|
|
|
if (but->dt == UI_EMBOSS_PULLDOWN) {
|
2018-12-03 19:31:54 +11:00
|
|
|
ui_but_submenu_enable(block, but);
|
2018-12-03 18:43:33 +11:00
|
|
|
}
|
|
|
|
}
|
|
|
|
else if (type == UI_BTYPE_SEARCH_MENU) {
|
|
|
|
if (proptype == PROP_POINTER) {
|
|
|
|
/* Search buttons normally don't get undo, see: T54580. */
|
|
|
|
but->flag |= UI_BUT_UNDO;
|
|
|
|
}
|
2014-02-20 13:30:52 +11:00
|
|
|
}
|
|
|
|
|
2016-09-22 00:10:53 +02:00
|
|
|
const char *info;
|
2017-05-12 01:42:42 +02:00
|
|
|
if (but->rnapoin.data && !RNA_property_editable_info(&but->rnapoin, prop, &info)) {
|
2016-09-22 00:10:53 +02:00
|
|
|
ui_def_but_rna__disable(but, info);
|
2009-01-04 02:09:41 +00:00
|
|
|
}
|
2008-12-16 07:55:43 +00:00
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
if (but->flag & UI_BUT_UNDO && (ui_but_is_rna_undo(but) == false)) {
|
2011-08-18 19:07:37 +00:00
|
|
|
but->flag &= ~UI_BUT_UNDO;
|
2011-08-18 16:26:34 +00:00
|
|
|
}
|
|
|
|
|
2009-08-12 08:16:10 +00:00
|
|
|
/* If this button uses units, calculate the step from this */
|
2014-11-09 21:20:40 +01:00
|
|
|
if ((proptype == PROP_FLOAT) && ui_but_is_unit(but)) {
|
2012-03-30 01:51:25 +00:00
|
|
|
but->a1 = ui_get_but_step_unit(but, but->a1);
|
2011-08-18 19:07:37 +00:00
|
|
|
}
|
2009-08-12 08:16:10 +00:00
|
|
|
|
2014-02-10 12:52:35 +11:00
|
|
|
if (func) {
|
|
|
|
but->menu_create_func = func;
|
|
|
|
but->poin = (char *)but;
|
|
|
|
}
|
|
|
|
|
2008-12-16 07:55:43 +00:00
|
|
|
return but;
|
|
|
|
}
|
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
static uiBut *ui_def_but_rna_propname(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip)
|
2011-08-06 14:57:55 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
PropertyRNA *prop = RNA_struct_find_property(ptr, propname);
|
2011-08-06 14:57:55 +00:00
|
|
|
uiBut *but;
|
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (prop) {
|
2012-08-18 18:11:51 +00:00
|
|
|
but = ui_def_but_rna(block, type, retval, str, x, y, width, height, ptr, prop, index, min, max, a1, a2, tip);
|
2011-08-06 14:57:55 +00:00
|
|
|
}
|
|
|
|
else {
|
2012-08-18 18:11:51 +00:00
|
|
|
but = ui_def_but(block, type, retval, propname, x, y, width, height, NULL, min, max, a1, a2, tip);
|
2011-08-06 14:57:55 +00:00
|
|
|
|
2016-09-22 00:10:53 +02:00
|
|
|
ui_def_but_rna__disable(but, "Unknown Property.");
|
2011-08-06 14:57:55 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
return but;
|
|
|
|
}
|
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
static uiBut *ui_def_but_operator_ptr(uiBlock *block, int type, wmOperatorType *ot, int opcontext, const char *str, int x, int y, short width, short height, const char *tip)
|
2008-12-16 07:55:43 +00:00
|
|
|
{
|
|
|
|
uiBut *but;
|
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (!str) {
|
2012-03-16 15:39:25 +00:00
|
|
|
if (ot && ot->srna)
|
|
|
|
str = RNA_struct_ui_name(ot->srna);
|
|
|
|
else
|
|
|
|
str = "";
|
2008-12-16 07:55:43 +00:00
|
|
|
}
|
2011-09-15 13:20:18 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
if ((!tip || tip[0] == '\0') && ot && ot->srna) {
|
2012-03-16 15:39:25 +00:00
|
|
|
tip = RNA_struct_ui_description(ot->srna);
|
2009-02-12 03:39:56 +00:00
|
|
|
}
|
2008-12-16 07:55:43 +00:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
but = ui_def_but(block, type, -1, str, x, y, width, height, NULL, 0, 0, 0, 0, tip);
|
2012-03-30 01:51:25 +00:00
|
|
|
but->optype = ot;
|
|
|
|
but->opcontext = opcontext;
|
2014-11-09 21:20:40 +01:00
|
|
|
but->flag &= ~UI_BUT_UNDO; /* no need for ui_but_is_rna_undo(), we never need undo here */
|
2008-12-16 07:55:43 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (!ot) {
|
2009-01-15 04:13:38 +00:00
|
|
|
but->flag |= UI_BUT_DISABLED;
|
2016-09-19 02:25:59 +02:00
|
|
|
but->disabled_info = "";
|
2009-01-15 04:13:38 +00:00
|
|
|
}
|
2008-12-16 07:55:43 +00:00
|
|
|
|
|
|
|
return but;
|
|
|
|
}
|
2012-03-19 22:21:40 +00:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefBut(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *but = ui_def_but(block, type, retval, str, x, y, width, height, poin, min, max, a1, a2, tip);
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
ui_but_update(but);
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
return but;
|
|
|
|
}
|
|
|
|
|
2013-10-21 23:35:08 +00:00
|
|
|
/**
|
|
|
|
* if \a _x_ is a power of two (only one bit) return the power,
|
2012-03-30 01:51:25 +00:00
|
|
|
* otherwise return -1.
|
2013-10-21 23:35:08 +00:00
|
|
|
*
|
|
|
|
* for powers of two:
|
|
|
|
* \code{.c}
|
|
|
|
* ((1 << findBitIndex(x)) == x);
|
|
|
|
* \endcode
|
2012-03-30 01:51:25 +00:00
|
|
|
*/
|
2019-01-04 11:05:53 +11:00
|
|
|
static int findBitIndex(uint x)
|
2011-09-28 05:53:40 +00:00
|
|
|
{
|
2011-12-16 09:25:07 +00:00
|
|
|
if (!x || !is_power_of_2_i(x)) { /* is_power_of_2_i(x) strips lowest bit */
|
2008-11-11 18:31:32 +00:00
|
|
|
return -1;
|
2012-03-24 06:38:07 +00:00
|
|
|
}
|
|
|
|
else {
|
2012-03-30 01:51:25 +00:00
|
|
|
int idx = 0;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2016-03-05 09:09:05 +11:00
|
|
|
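/* locate the single set bit by halving: if any bit is set in the upper half
 * of the remaining range, shift it down and add that half-width to the index */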
if (x & 0xFFFF0000) { idx += 16; x >>= 16; }
|
|
|
|
if (x & 0xFF00) { idx += 8; x >>= 8; }
|
|
|
|
if (x & 0xF0) { idx += 4; x >>= 4; }
|
|
|
|
if (x & 0xC) { idx += 2; x >>= 2; }
|
|
|
|
if (x & 0x2) { idx += 1; }
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
return idx;
|
|
|
|
}
|
|
|
|
}
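A quick illustration of the contract from the comment above (values are arbitrary):

findBitIndex(8);   /* returns 3, since 8 == (1 << 3) */
findBitIndex(6);   /* returns -1, more than one bit set */
findBitIndex(0);   /* returns -1, no bit set */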
|
|
|
|
|
|
|
|
/* autocomplete helper functions */
|
|
|
|
struct AutoComplete {
|
2011-09-26 17:30:56 +00:00
|
|
|
size_t maxlen;
|
2013-11-19 16:31:48 +01:00
|
|
|
int matches;
|
2008-11-11 18:31:32 +00:00
|
|
|
char *truncate;
|
2010-12-03 17:05:21 +00:00
|
|
|
const char *startname;
|
2008-11-11 18:31:32 +00:00
|
|
|
};
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
AutoComplete *UI_autocomplete_begin(const char *startname, size_t maxlen)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
|
|
|
AutoComplete *autocpl;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
autocpl = MEM_callocN(sizeof(AutoComplete), "AutoComplete");
|
|
|
|
autocpl->maxlen = maxlen;
|
2013-11-19 16:31:48 +01:00
|
|
|
autocpl->matches = 0;
|
2012-03-30 01:51:25 +00:00
|
|
|
autocpl->truncate = MEM_callocN(sizeof(char) * maxlen, "AutoCompleteTruncate");
|
|
|
|
autocpl->startname = startname;
|
	return autocpl;
}

void UI_autocomplete_update_name(AutoComplete *autocpl, const char *name)
{
	char *truncate = autocpl->truncate;
	const char *startname = autocpl->startname;
	int a;

	for (a = 0; a < autocpl->maxlen - 1; a++) {
		if (startname[a] == 0 || startname[a] != name[a])
			break;
	}

	/* found a match */
	if (startname[a] == 0) {
		autocpl->matches++;
		/* first match */
		if (truncate[0] == 0)
			BLI_strncpy(truncate, name, autocpl->maxlen);
		else {
			/* cut truncate down to the prefix it shares with name */
			for (a = 0; a < autocpl->maxlen - 1; a++) {
				if (name[a] == 0) {
					truncate[a] = 0;
					break;
				}
				else if (truncate[a] != name[a])
					truncate[a] = 0;
			}
		}
	}
}
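/* Editorial note (not part of the original file): UI_autocomplete_update_name()
 * above keeps the longest completion shared by all names that match the typed
 * prefix. For example, with startname "Cu", feeding "Cube.001" and then "Cube"
 * first sets truncate to "Cube.001" and then cuts it down to "Cube", with
 * autocpl->matches == 2, i.e. a partial match. */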

int UI_autocomplete_end(AutoComplete *autocpl, char *autoname)
{
	int match = AUTOCOMPLETE_NO_MATCH;
	if (autocpl->truncate[0]) {
		if (autocpl->matches == 1) {
			match = AUTOCOMPLETE_FULL_MATCH;
		}
		else {
			match = AUTOCOMPLETE_PARTIAL_MATCH;
		}
		BLI_strncpy(autoname, autocpl->truncate, autocpl->maxlen);
	}
	else {
		if (autoname != autocpl->startname) { /* don't copy a string over itself */
			BLI_strncpy(autoname, autocpl->startname, autocpl->maxlen);
		}
	}

	MEM_freeN(autocpl->truncate);
	MEM_freeN(autocpl);

	return match;
}

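As an editorial aside (not part of the original file), typical use of the three
autocomplete helpers above might look like the sketch below. The constructor is
assumed to be the function whose body opens this section, with a signature along
the lines of UI_autocomplete_begin(startname, maxlen); the candidate names and
the helper name are made up.

/* Hypothetical usage sketch for the autocomplete helpers. */
static int autocomplete_usage_sketch(char *name, size_t maxlen)
{
	const char *candidates[] = {"Cube", "Cube.001", "Camera", "Lamp"};
	AutoComplete *autocpl = UI_autocomplete_begin(name, maxlen);
	int i;

	/* feed every candidate; only those matching the typed prefix contribute */
	for (i = 0; i < 4; i++) {
		UI_autocomplete_update_name(autocpl, candidates[i]);
	}

	/* writes the common completion back into 'name', frees autocpl and returns
	 * AUTOCOMPLETE_NO_MATCH, AUTOCOMPLETE_PARTIAL_MATCH or AUTOCOMPLETE_FULL_MATCH */
	return UI_autocomplete_end(autocpl, name);
}
/* end of editorial sketch */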
static void ui_but_update_and_icon_set(uiBut *but, int icon)
{
	if (icon) {
		ui_def_but_icon(but, icon, UI_HAS_ICON);
	}

	ui_but_update(but);
}

static uiBut *uiDefButBit(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
{
	/* findBitIndex() presumably returns the index of the single set bit in 'bit',
	 * or -1 when 'bit' does not have exactly one bit set. */
	int bitIdx = findBitIndex(bit);

	if (bitIdx == -1) {
Port of part of the Interface code to 2.50.
This is based on the current trunk version, so these files should not need
merges. There's two things (clipboard and intptr_t) that are missing in 2.50
and commented out with XXX 2.48, these can be enabled again once trunk is
merged into this branch.
Further this is not all interface code, there are many parts commented out:
* interface.c: nearly all button types, missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators,
making it non-blocking.
* interface_regions: this is a part of interface.c, split off, contains code to
create regions for tooltips, menus, pupmenu (that one is crashing currently),
color chooser, basically regions with buttons which is fairly independent of
core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and
icons yet. Panels should probably become (free floating) regions?
* text.c: (formerly language.c) for drawing text and translation. this works
but is using bad globals still and could be cleaned up.
Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern
declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.
Core:
* The API for creating blocks, buttons, etc is nearly the same still. Blocks
are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws
should be possible while editing a button. That means though that we need
some sort of persistence even though the blender model is to recreate buttons
for each redraw. So when a new block is created, some matching happens to
find out which buttons correspond to buttons in the previously created block,
and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code
in this module, instead of multiple function calls in the windowmanager.
* Removed most static/globals from interface.c.
* Removed UIafterfunc_ I don't think it's needed anymore, and not sure how it
would integrate here?
* Currently only full window redraws are used, this should become per region
and maybe per button later.
Operators:
* Events are currently handled through two operators: button activate and menu
handle. Operators may not be the best way to implement this, since there are
currently some issues with events being missed, but they can become a special
handler type instead, this should not be a big change.
* The button activate operator runs as long as a button is active, and will
handle all interaction with that button until the button is not activated
anymore. This means clicking, text editing, number dragging, opening menu
blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to made
non-blocking. That means variables that were previously on the stack, now
need to be stored away in a struct such that they can be accessed again when
the operator receives more events.
* Additionally the place in the ui_do_but code indicated the state, now that
needs to be set explicit in order to handle the right events in the right
state. So an activated button can be in one of these states: init, highlight,
wait_flash, wait_release, wait_key_event, num_editing, text_editing,
text_selecting, block_open, exit.
* For each button type an ui_apply_but_* function has also been separated out
from ui_do_but. This makes it possible to continuously apply the button as
text is being typed for example, and there is an option in the code to enable
this. Since the code non-blocking and can deal with the button being deleted
even, it should be safe to do this.
* When editing text, dragging numbers, etc, the actual data (but->poin) is not
being edited, since that would mean data is being edited without correct
updates happening, while some other part of blender may be accessing that
data in the meantime. So data values, strings, vectors are written to a
temporary location and only flush in the apply function.
Regions:
* Menus, color chooser, tooltips etc all create screen level regions. Such menu
blocks give a handle to the button that creates it, which will contain the
results of the menu block once a MESSAGE event is received from that menu
block.
* For this type of menu block the coordinates used to be in window space. They
are still created that way and ui_positionblock still works with window
coordinates, but after that the block and buttons are brought back to region
coordinates since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed, the windowmanager
should have enough information with these screen level regions to have full
control over what gets drawn when and to then do correct compositing.
Testing:
* The header in the time space currently has some buttons to test the UI code.
2008-11-11 18:31:32 +00:00
|
|
|
        return NULL;
    }
    else {
        return uiDefBut(block, type | UI_BUT_POIN_BIT | bitIdx, retval, str, x, y, width, height, poin, min, max, a1, a2, tip);
    }
}
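
/* Typed wrappers around the generic uiDefBut()/uiDefButBit() calls: each
 * variant ORs a pointer-type flag (UI_BUT_POIN_FLOAT, UI_BUT_POIN_INT,
 * UI_BUT_POIN_SHORT) into 'type' so the generic button code knows how to
 * read and write the value that 'poin' points to. */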
uiBut *uiDefButF(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefBut(block, type | UI_BUT_POIN_FLOAT, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
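
/* Usage sketch (illustrative only; NUM, B_REDR and the 'ts' pointer are
 * assumptions, not defined in this file):
 *
 *     uiDefButF(block, NUM, B_REDR, "Scale: ", 10, 10, 150, 19,
 *               &ts->scale, 0.0f, 10.0f, 10, 2, "Uniform scale factor");
 */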
uiBut *uiDefButBitF(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefButBit(block, type | UI_BUT_POIN_FLOAT, bit, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefButI(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefBut(block, type | UI_BUT_POIN_INT, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
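
/* Note: min/max/a1/a2 stay floats even for the int and short variants; the
 * generic uiDefBut() code presumably rounds/clamps them according to the
 * pointer type it is given. */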
uiBut *uiDefButBitI(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefButBit(block, type | UI_BUT_POIN_INT, bit, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
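
/* Usage sketch (illustrative only; TOG, B_REDR and the flag/struct names are
 * assumptions):
 *
 *     uiDefButBitI(block, TOG, MY_FLAG_ENABLE, B_REDR, "Enable", 10, 30, 80, 19,
 *                  &settings->flag, 0, 0, 0, 0, "Toggle the (hypothetical) enable flag");
 */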
uiBut *uiDefButS(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
return uiDefBut(block, type | UI_BUT_POIN_SHORT, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefButBitS(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
return uiDefButBit(block, type | UI_BUT_POIN_SHORT, bit, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
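The wrappers above do no real work of their own: they OR a UI_BUT_POIN_* flag into the button type and pass the typed pointer on as void *, so the core button code knows how to read and write the value behind but->poin. A minimal usage sketch follows; the TOG button type, the flag bit and the labels are illustrative values, not taken from this file.

/* hypothetical settings field holding bit flags (sketch only) */
static short example_flags = 0;

static void example_bit_toggle(uiBlock *block)
{
	/* toggles bit 0 of example_flags; UI_BUT_POIN_SHORT is added internally */
	uiDefButBitS(block, TOG, (1 << 0), 0, "Enable Thing",
	             0, 0, 100, 20, &example_flags,
	             0.0f, 0.0f, 0.0f, 0.0f, "Illustrative tooltip");
}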
uiBut *uiDefButC(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
return uiDefBut(block, type | UI_BUT_POIN_CHAR, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefButBitC(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
return uiDefButBit(block, type | UI_BUT_POIN_CHAR, bit, retval, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefButR(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip)
{
uiBut *but;
but = ui_def_but_rna_propname(block, type, retval, str, x, y, width, height, ptr, propname, index, min, max, a1, a2, tip);
ui_but_update(but);
return but;
}
uiBut *uiDefButR_prop(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, PointerRNA *ptr, PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip)
{
uiBut *but;
but = ui_def_but_rna(block, type, retval, str, x, y, width, height, ptr, prop, index, min, max, a1, a2, tip);
ui_but_update(but);
return but;
}
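uiDefButR resolves the property by name on every call (via ui_def_but_rna_propname), while uiDefButR_prop takes a PropertyRNA pointer the caller has already resolved, so the lookup can be done once and reused. A sketch under that assumption; the "use_nodes" identifier, the TOG type, the NULL label (falling back to the RNA UI name) and the placement values are illustrative.

static void example_rna_toggle(uiBlock *block, PointerRNA *ptr)
{
	/* resolve the property once, then hand it to the _prop variant */
	PropertyRNA *prop = RNA_struct_find_property(ptr, "use_nodes");

	if (prop) {
		uiDefButR_prop(block, TOG, 0, NULL, 0, 0, 100, 20,
		               ptr, prop, -1, 0.0f, 0.0f, 0.0f, 0.0f, NULL);
	}
}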
uiBut *uiDefButO_ptr(uiBlock *block, int type, wmOperatorType *ot, int opcontext, const char *str, int x, int y, short width, short height, const char *tip)
{
uiBut *but;
but = ui_def_but_operator_ptr(block, type, ot, opcontext, str, x, y, width, height, tip);
ui_but_update(but);
return but;
}
uiBut *uiDefButO(uiBlock *block, int type, const char *opname, int opcontext, const char *str, int x, int y, short width, short height, const char *tip)
{
wmOperatorType *ot = WM_operatortype_find(opname, 0);
if (str == NULL && ot == NULL) str = opname;
return uiDefButO_ptr(block, type, ot, opcontext, str, x, y, width, height, tip);
}
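uiDefButO looks the operator type up by name and, when neither a label nor the operator can be found, shows the raw opname so the missing operator is still visible in the UI. A usage sketch; the operator idname and placement values are illustrative.

static void example_operator_button(uiBlock *block)
{
	/* push button invoking the named operator; the NULL label falls back
	 * to the operator's own UI name */
	uiDefButO(block, BUT, "SCREEN_OT_screen_full_area", WM_OP_INVOKE_DEFAULT,
	          NULL, 0, 0, 120, 20, "Illustrative tooltip");
}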
Drag and drop 2.5 integration! Finally, slashdot regulars can use
Blender too now! :)
** Drag works as follows:
- drag-able items are defined by the standard interface ui toolkit
- each button can get this feature, via uiButSetDragXXX(but, ...).
There are calls to define drag-able images, ID blocks, RNA paths,
file paths, and so on. By default you drag an icon, exceptionally
an ImBuf
- Drag items are registered centrally in the WM; it allows multiple
simultaneous drag items too, but this is not implemented yet
** Drop works as follows:
- On mouse release, and if drag items exist in the WM, it converts
the mouse event to an EVT_DROP type. This event then gets the full
drag info as customdata
- drop regions are defined with WM_dropbox_add(), similar to keymaps
you can make a "drop map" this way, which becomes a set of 'drop map
handlers' in the queues (see the dropbox sketch after this message).
- next to that the UI kit handles some common button types (like
accepting ID or names) to be catching a drop event too.
- Every "drop box" has two callbacks:
- poll() = check if the event drag data is relevant for this box
- copy() = fill in custom properties in the dropbox to initialize
an operator
- The dropbox handler then calls its standard Operator with its
dropbox properties.
** Currently implemented
Drag items:
- ID icons in browse buttons
- ID icons in context menu of properties region
- ID icons in outliner and rna viewer
- FileBrowser icons
- FileBrowser preview images
Drag-able icons are subtly visualized by making them a bit brighter
on mouse-over. In case the icon is a button or UI element too (most
cases), the drag-able feature will make the item react to
mouse-release instead of mouse-press.
Drop options:
- UI buttons: ID and text buttons (paste name)
- View3d: Object ID drop copies object
- View3d: Material ID drop assigns to object under cursor
- View3d: Image ID drop assigns to object UV texture under cursor
- Sequencer: Path drop will add either Image or Movie strip
- Image window: Path drop will open image
** Drag and drop Notes:
- Dropping into another Blender window (from same application) works
too. I've added code that passes on mousemoves and clicks to other
windows, without activating them though. This does make using multi-window
Blender a bit friendlier.
- Dropping a file path to an image, is not the same as dropping an
Image ID... keep this in mind. Sequencer for example wants paths to
be dropped, textures in 3d window wants an Image ID.
- Although drop boxes could be defined via Python, I suggest they're
part of the UI and editor design (= how we want an editor to work), and
not default offered configurable like keymaps.
- At the moment only one item can be dragged at a time. This is for
several reasons.... For one, Blender doesn't have a well defined
uniform way to define "what is selected" (files, outliner items, etc).
Secondly there are potential conflicts on what to do when you drop mixed
drag sets on spots. All undefined stuff... nice for later.
- Example to bypass the above: a collection of images that form a strip,
should be represented in filewindow as a single sequence anyway.
This then will fit well and gets handled neatly by design.
- Another option to check is to allow multiple options per drop... it
could show the operator as a sort of menu, allowing arrow or scrollwheel
to choose. For time being I'd prefer to try to design a singular drop
though, just offer only one drop action per data type on given spots.
- What does work already, but a tad slow, is to use a function that
detects an object (type) under cursor, so a drag item's option can be
further refined (like drop object on object = parent). (disabled)
** More notes
- Added saving for Region layouts (like split points for toolbar)
- Label buttons now handle mouse over
- File list: added full path entry for drop feature.
- Filesel bugfix: wm_operator_exec() got called there and fully handled,
while WM event code tried same. Added new OPERATOR_HANDLED flag for this.
Maybe python needs it too?
- Cocoa: added window move event, so multi-win setups work OK (didn't save).
- Interface_handlers.c: removed win->active
- Severe area copy bug: area handlers were not set to NULL
- Filesel bugfix: next/prev folder list was not copied on area copies
** Leftover todos
- Cocoa windows seem to hang on cases still... needs check
- Cocoa 'draw overlap' swap doesn't work
- Cocoa window loses focus permanently on using Spotlight
(for these reasons, makefile building has Carbon as default atm)
- ListView templates in UI cannot become dragged yet, needs review...
it consists of two overlapping UI elements, preventing handling icon clicks.
- There's already Ghost library code to handle dropping from OS
into Blender window. I've noticed this code is unfinished for Macs, but
seems to be complete for Windows. Needs test... currently, an external
drop event will print to the console when successfully delivered to Blender's WM.
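The drop half described above can be summarized in code: a dropbox pairs an operator with a poll() callback that filters incoming drags and a copy() callback that turns the drag data into operator properties. The sketch below follows that description; the callback signatures, the WM_DRAG_PATH check, the "filepath" property and the operator idname are assumptions for illustration and differ between Blender versions.

/* sketch of a dropbox as described in the notes above (assumed signatures) */
static int example_drop_poll(bContext *C, wmDrag *drag, const wmEvent *event)
{
	/* only react to dragged file paths */
	return (drag->type == WM_DRAG_PATH);
}

static void example_drop_copy(wmDrag *drag, wmDropBox *drop)
{
	/* fill the operator's properties from the drag data */
	RNA_string_set(drop->ptr, "filepath", drag->path);
}

static void example_register_dropbox(void)
{
	/* hypothetical drop map; WM_dropbox_add() binds the callbacks to an operator */
	ListBase *lb = WM_dropboxmap_find("Example Window", 0, 0);

	WM_dropbox_add(lb, "EXAMPLE_OT_open_dropped_file",
	               example_drop_poll, example_drop_copy);
}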
/* if a1==1.0 then a2 is an extra icon blending factor (alpha 0.0 - 1.0) */
uiBut *uiDefIconBut(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
{
uiBut *but = ui_def_but(block, type, retval, "", x, y, width, height, poin, min, max, a1, a2, tip);
ui_but_update_and_icon_set(but, icon);
	return but;
}
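
/* uiDefIconButBit: icon button that addresses a single flag bit inside the value
 * pointed to by poin. findBitIndex() presumably maps the single-bit mask to its
 * bit index; the index is then encoded into the button type together with
 * UI_BUT_POIN_BIT, and a mask without exactly one bit set produces no button. */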
static uiBut *uiDefIconButBit(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
{
	int bitIdx = findBitIndex(bit);
	if (bitIdx == -1) {
		return NULL;
	}
	else {
		return uiDefIconBut(block, type | UI_BUT_POIN_BIT | bitIdx, retval, icon, x, y, width, height, poin, min, max, a1, a2, tip);
	}
}
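
/* Typed convenience wrappers: each ORs UI_BUT_POIN_FLOAT or UI_BUT_POIN_INT into
 * the button type so the generic button code knows how to read and write *poin,
 * and the ...Bit variants go through uiDefIconButBit to target one flag bit.
 * Hypothetical call, with invented names, just to illustrate the parameter order:
 *
 *   uiDefIconButBitI(block, TOG, MY_FLAG, 0, my_icon,
 *                    x, y, w, h, &my_flags, 0.0f, 0.0f, 0.0f, 0.0f, "Toggle my flag");
 */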
uiBut *uiDefIconButF(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconBut(block, type | UI_BUT_POIN_FLOAT, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconButBitF(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconButBit(block, type | UI_BUT_POIN_FLOAT, bit, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconButI(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconBut(block, type | UI_BUT_POIN_INT, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconButBitI(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconButBit(block, type | UI_BUT_POIN_INT, bit, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
|
}

uiBut *uiDefIconButS(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconBut(block, type | UI_BUT_POIN_SHORT, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}

uiBut *uiDefIconButBitS(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconButBit(block, type | UI_BUT_POIN_SHORT, bit, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}

uiBut *uiDefIconButC(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconBut(block, type | UI_BUT_POIN_CHAR, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}

uiBut *uiDefIconButBitC(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconButBit(block, type | UI_BUT_POIN_CHAR, bit, retval, icon, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
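
These typed variants only tag the button type with the width of the value behind *poin (UI_BUT_POIN_INT/SHORT/CHAR) before forwarding to the generic uiDefIconBut/uiDefIconButBit calls. A minimal caller-side sketch follows; it assumes a uiBlock *block in scope, the TOG button type, the ICON_SNAP_ON icon and the UI_UNIT_X/UI_UNIT_Y sizes from the regular UI API, while the flag field, bit and coordinates are made-up placeholders:

    /* hypothetical flag field and bit, purely for illustration */
    static short example_flags = 0;
    enum { EXAMPLE_FLAG_ENABLE = (1 << 0) };
    const int x = 0, y = 0;  /* placeholder position inside the layout */

    /* toggles EXAMPLE_FLAG_ENABLE inside example_flags; the UI_BUT_POIN_SHORT
     * tag added by uiDefIconButBitS tells the button code to read and write a short */
    uiDefIconButBitS(block, TOG, EXAMPLE_FLAG_ENABLE, 0, ICON_SNAP_ON,
                     x, y, UI_UNIT_X, UI_UNIT_Y, &example_flags,
                     0.0f, 0.0f, 0.0f, 0.0f, "Toggle the example flag");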

uiBut *uiDefIconButR(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip)
{
    uiBut *but;

    but = ui_def_but_rna_propname(block, type, retval, "", x, y, width, height, ptr, propname, index, min, max, a1, a2, tip);
    ui_but_update_and_icon_set(but, icon);
    return but;
}

uiBut *uiDefIconButR_prop(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, PointerRNA *ptr, PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip)
{
    uiBut *but;

    but = ui_def_but_rna(block, type, retval, "", x, y, width, height, ptr, prop, index, min, max, a1, a2, tip);
    ui_but_update_and_icon_set(but, icon);
    return but;
}
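
uiDefIconButR and uiDefIconButR_prop bind the button to an RNA property instead of a raw pointer, either by property name or by an already resolved PropertyRNA. A rough usage sketch, assuming the caller has an Object *ob, a uiBlock *block and placeholder coordinates in scope, and that the "hide" property, the ICONTOG type and the icon used here exist in this build; all of these values are illustrative, not taken from this file:

    PointerRNA ptr;

    /* wrap the object in a PointerRNA so the button can resolve "hide" on it */
    RNA_id_pointer_create(&ob->id, &ptr);

    uiDefIconButR(block, ICONTOG, 0, ICON_RESTRICT_VIEW_OFF,
                  x, y, UI_UNIT_X, UI_UNIT_Y,
                  &ptr, "hide", 0, 0.0f, 0.0f, 0.0f, 0.0f,
                  "Toggle object visibility (example)");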

uiBut *uiDefIconButO_ptr(uiBlock *block, int type, wmOperatorType *ot, int opcontext, int icon, int x, int y, short width, short height, const char *tip)
{
    uiBut *but;

    but = ui_def_but_operator_ptr(block, type, ot, opcontext, "", x, y, width, height, tip);
    ui_but_update_and_icon_set(but, icon);
    return but;
}

uiBut *uiDefIconButO(uiBlock *block, int type, const char *opname, int opcontext, int icon, int x, int y, short width, short height, const char *tip)
{
    wmOperatorType *ot = WM_operatortype_find(opname, 0);
    return uiDefIconButO_ptr(block, type, ot, opcontext, icon, x, y, width, height, tip);
}
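
The operator variants attach a wmOperatorType rather than data: uiDefIconButO looks the type up by idname and defers to uiDefIconButO_ptr, so pressing the button invokes that operator. A small sketch, assuming block, x and y are in scope as before and that the operator idname, context flag and icon below exist in the build; they are placeholders chosen for illustration:

    uiDefIconButO(block, BUT, "SCREEN_OT_screen_full_area", WM_OP_INVOKE_DEFAULT,
                  ICON_FULLSCREEN_ENTER, x, y, UI_UNIT_X, UI_UNIT_Y,
                  "Toggle full screen (example)");
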
/* Button containing both string label and icon */
uiBut *uiDefIconTextBut(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
{
    uiBut *but = ui_def_but(block, type, retval, str, x, y, width, height, poin, min, max, a1, a2, tip);

    ui_but_update_and_icon_set(but, icon);

    but->drawflag |= UI_BUT_ICON_LEFT;

    return but;
}
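/* The *Bit variants below take a bit mask rather than a plain value pointer:
 * findBitIndex() converts the mask to a bit index (returning -1 if the mask
 * is not usable), and that index is packed into the button type together with
 * UI_BUT_POIN_BIT so the generic button code can tell which bit of *poin it
 * operates on. */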

static uiBut *uiDefIconTextButBit(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, void *poin, float min, float max, float a1, float a2, const char *tip)
{
    int bitIdx = findBitIndex(bit);

    if (bitIdx == -1) {
        return NULL;
    }
    else {
        return uiDefIconTextBut(block, type | UI_BUT_POIN_BIT | bitIdx, retval, icon, str, x, y, width, height, poin, min, max, a1, a2, tip);
    }
}
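/* The typed wrappers that follow (the F versions for float pointers, the I
 * versions for int pointers) only tag the button type with UI_BUT_POIN_FLOAT
 * or UI_BUT_POIN_INT, cast the pointer to void *, and forward to the generic
 * functions above; the POIN flag tells the core code how to interpret the
 * memory behind *poin. */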

uiBut *uiDefIconTextButF(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconTextBut(block, type | UI_BUT_POIN_FLOAT, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
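/* Usage sketch (hypothetical identifiers and values, for illustration only;
 * not taken from this file): a float number button showing an icon and label,
 * bound to a float field in some settings struct:
 *
 *   uiDefIconTextButF(block, NUM, B_REDR, ICON_X, "Size:",
 *                     0, 0, 120, 20, &settings->size,
 *                     0.0f, 10.0f, 0.1f, 3, "Object display size");
 */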

uiBut *uiDefIconTextButBitF(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconTextButBit(block, type | UI_BUT_POIN_FLOAT, bit, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}

uiBut *uiDefIconTextButI(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
    return uiDefIconTextBut(block, type | UI_BUT_POIN_INT, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefIconTextButBitI(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconTextButBit(block, type | UI_BUT_POIN_INT, bit, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconTextButS(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconTextBut(block, type | UI_BUT_POIN_SHORT, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconTextButBitS(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconTextButBit(block, type | UI_BUT_POIN_SHORT, bit, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconTextButC(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconTextBut(block, type | UI_BUT_POIN_CHAR, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
uiBut *uiDefIconTextButBitC(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip)
{
	return uiDefIconTextButBit(block, type | UI_BUT_POIN_CHAR, bit, retval, icon, str, x, y, width, height, (void *) poin, min, max, a1, a2, tip);
}
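/* RNA-backed variants: uiDefIconTextButR looks the property up by name via
 * ui_def_but_rna_propname, while uiDefIconTextButR_prop takes a PropertyRNA
 * pointer directly. Both then update the button, assign the icon and draw it to
 * the left of the text (UI_BUT_ICON_LEFT). */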
uiBut *uiDefIconTextButR(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip)
{
	uiBut *but;
	but = ui_def_but_rna_propname(block, type, retval, str, x, y, width, height, ptr, propname, index, min, max, a1, a2, tip);
	ui_but_update_and_icon_set(but, icon);
	but->drawflag |= UI_BUT_ICON_LEFT;
	return but;
}
uiBut *uiDefIconTextButR_prop(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, PointerRNA *ptr, PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip)
{
	uiBut *but;
	but = ui_def_but_rna(block, type, retval, str, x, y, width, height, ptr, prop, index, min, max, a1, a2, tip);
	ui_but_update_and_icon_set(but, icon);
	but->drawflag |= UI_BUT_ICON_LEFT;
	return but;
}
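/* Operator variant: builds the button from a wmOperatorType and operator context
 * via ui_def_but_operator_ptr, then assigns the icon, again drawn left of the
 * text. */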
uiBut *uiDefIconTextButO_ptr(uiBlock *block, int type, wmOperatorType *ot, int opcontext, int icon, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but;
	but = ui_def_but_operator_ptr(block, type, ot, opcontext, str, x, y, width, height, tip);
	ui_but_update_and_icon_set(but, icon);
	but->drawflag |= UI_BUT_ICON_LEFT;
|
2008-11-11 18:31:32 +00:00
|
|
|
return but;
|
|
|
|
}
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefIconTextButO(uiBlock *block, int type, const char *opname, int opcontext, int icon, const char *str, int x, int y, short width, short height, const char *tip)
|
2012-01-22 03:30:07 +00:00
|
|
|
{
|
|
|
|
wmOperatorType *ot = WM_operatortype_find(opname, 0);
|
2018-05-23 10:47:12 +02:00
|
|
|
if (str && str[0] == '\0')
|
2017-03-17 16:47:19 +03:00
|
|
|
return uiDefIconButO_ptr(block, type, ot, opcontext, icon, x, y, width, height, tip);
|
2012-08-18 18:11:51 +00:00
|
|
|
return uiDefIconTextButO_ptr(block, type, ot, opcontext, icon, str, x, y, width, height, tip);
|
2012-01-22 03:30:07 +00:00
|
|
|
}
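A hedged usage sketch (not code from this file) showing how the wrapper above
would typically be called to add an icon+text operator button to a block. The
constants UI_BTYPE_BUT, WM_OP_INVOKE_DEFAULT and ICON_NONE are the usual
Blender defines and are assumed here; "SCREEN_OT_redo_last" is just an example
operator name.

uiBut *but = uiDefIconTextButO(
        block, UI_BTYPE_BUT, "SCREEN_OT_redo_last", WM_OP_INVOKE_DEFAULT,
        ICON_NONE, "Redo Last", 0, 0, 120, 20, "Repeat the last operator");
/* passing an empty (but non-NULL) label falls back to the icon-only
 * uiDefIconButO_ptr variant, as handled inside uiDefIconTextButO above */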
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
/* END Button containing both string label and icon */
|
|
|
|
|
|
|
|
/* cruft to make uiBlock and uiBut private */
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
int UI_blocklist_min_y_get(ListBase *lb)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
|
|
|
uiBlock *block;
|
2012-03-30 01:51:25 +00:00
|
|
|
int min = 0;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
for (block = lb->first; block; block = block->next)
|
2012-08-18 16:53:46 +00:00
|
|
|
if (block == lb->first || block->rect.ymin < min)
|
|
|
|
min = block->rect.ymin;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
return min;
|
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_direction_set(uiBlock *block, char direction)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->direction = direction;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
/* this call escapes if there's alignment flags */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_order_flip(uiBlock *block)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2014-06-25 19:33:35 +10:00
|
|
|
uiBut *but;
|
2012-03-30 01:51:25 +00:00
|
|
|
float centy, miny = 10000, maxy = -10000;
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2012-03-24 06:38:07 +00:00
|
|
|
if (U.uiflag & USER_MENUFIXEDORDER)
|
2009-06-16 02:40:39 +00:00
|
|
|
return;
|
2012-03-24 06:38:07 +00:00
|
|
|
else if (block->flag & UI_BLOCK_NO_FLIP)
|
2009-08-21 02:51:56 +00:00
|
|
|
return;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
2013-11-21 14:43:08 +01:00
|
|
|
if (but->drawflag & UI_BUT_ALIGN) return;
|
2012-08-18 16:53:46 +00:00
|
|
|
if (but->rect.ymin < miny) miny = but->rect.ymin;
|
|
|
|
if (but->rect.ymax > maxy) maxy = but->rect.ymax;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
/* mirror trick */
|
2012-03-30 01:51:25 +00:00
|
|
|
centy = (miny + maxy) / 2.0f;
|
|
|
|
for (but = block->buttons.first; but; but = but->next) {
|
2012-08-18 16:53:46 +00:00
|
|
|
but->rect.ymin = centy - (but->rect.ymin - centy);
|
|
|
|
but->rect.ymax = centy - (but->rect.ymax - centy);
|
|
|
|
SWAP(float, but->rect.ymin, but->rect.ymax);
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
2014-11-15 14:32:23 +01:00
|
|
|
|
|
|
|
block->flag ^= UI_BLOCK_IS_FLIP;
|
}
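To make the temporary-value idea in the note above concrete, here is a minimal
sketch of that pattern. The struct, field and function names below
(uiHandleButtonDataSketch, ui_numedit_sketch, ui_apply_but_NUM_sketch and the
setter ui_set_but_val) are illustrative assumptions, not the actual 2.50 code;
only the pattern itself (edit a copy, flush through but->poin in the apply step)
is taken from the note. It assumes the interface module's internal headers for
the uiBut type.

/* Sketch only: names are assumed for illustration. */
typedef struct uiHandleButtonDataSketch {
    double value;   /* temporary copy edited while in the num_editing state */
    char str[64];   /* temporary copy edited while in the text_editing state */
} uiHandleButtonDataSketch;

/* While dragging a number button, only the temporary copy changes;
 * but->poin is deliberately left untouched here. */
static void ui_numedit_sketch(uiHandleButtonDataSketch *data, double new_value)
{
    data->value = new_value;
}

/* The separated apply step is the only place the real data is flushed, so
 * updates can run from one spot. ui_set_but_val() is assumed to be the
 * existing setter that writes through but->poin. */
static void ui_apply_but_NUM_sketch(uiBut *but, uiHandleButtonDataSketch *data)
{
    ui_set_but_val(but, data->value);
}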
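/* Simple bit-flag helpers: the pair below sets or clears bits in block->flag. */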
void UI_block_flag_enable(uiBlock *block, int flag)
{
    block->flag |= flag;
}
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here are the really bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so there was no warning for a wrong
call to ED_undo_push()
- the extern Py API .h was not included in bpy_interface.c, showing
several calls using different args.
Further, just added the missing includes and removed unused vars.
2009-04-14 15:59:52 +00:00
void UI_block_flag_disable(uiBlock *block, int flag)
{
    block->flag &= ~flag;
}
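/* Per-button counterparts of the block helpers: set, clear and (below) test bits in but->flag. */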
void UI_but_flag_enable(uiBut *but, int flag)
{
    but->flag |= flag;
}
void UI_but_flag_disable(uiBut *but, int flag)
{
    but->flag &= ~flag;
}
Driver Keyframing: Some tweaks to make inserting keyframes on Driver F-Curves easier
Now, when trying to insert a keyframe on a driven property (using IKEY, or with
autokeying enabled), the keyframes get created on the driver's F-Curve
(instead of creating a new F-Curve in the active action, which would never
do anything). Furthermore, the x-value of the new keyframe will be the current
result of the driver expression.
Why/Motivations:
This way, it becomes easier to create corrective drivers, as you can position all
the targets the driver depends on, then adjust the driver value until it does what
you need, and then you keyframe that value to bake it into the Driver F-Curve
(in effect, "training" the computer how to behave in that case).
Usage Notes:
* In practice, that particular workflow is still quite clunky to achieve, due to some
quirks of how the driver system and the UI widgets interact. Specifically, you'll
need to disable/mute the driver before trying to edit the setting (to prevent the
driver from immediately resetting the value - before even autokey fires!). However,
if you're using the Graph Editor to preview/monitor/manage the keying process, you'll
then want to re-enable the driver before changing the targets, so that you can see
how much of a change you'll want to be applying!
* The warning about editing driver values may need to be disabled or selectively
knocked out. I had it disabled while testing this functionality, but it's actually
harmless in its current state (if just a bit annoying).
2016-03-24 19:33:13 +13:00
bool UI_but_flag_is_set(uiBut *but, int flag)
{
    return (but->flag & flag) != 0;
}
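/* The drawflag variants below work on but->drawflag, which carries draw-time
 * hints such as the UI_BUT_TEXT_LEFT / UI_BUT_TEXT_RIGHT alignment bits. */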
void UI_but_drawflag_enable(uiBut *but, int flag)
{
    but->drawflag |= flag;
}
void UI_but_drawflag_disable(uiBut *but, int flag)
{
    but->drawflag &= ~flag;
}
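As a usage illustration of the flag helpers above, a small hedged sketch: the
parameter `is_locked` and the UI_BUT_DISABLED flag value are assumptions here,
while UI_but_flag_enable/disable/is_set and UI_but_drawflag_enable are the
functions defined above.

/* Sketch: grey a button out while some state is locked, and left-align its label. */
void ui_but_lock_sketch(uiBut *but, bool is_locked)
{
    if (is_locked) {
        UI_but_flag_enable(but, UI_BUT_DISABLED);
    }
    else if (UI_but_flag_is_set(but, UI_BUT_DISABLED)) {
        UI_but_flag_disable(but, UI_BUT_DISABLED);
    }
    UI_but_drawflag_enable(but, UI_BUT_TEXT_LEFT);
}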
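/* Convert a pulldown button into a regular menu button and flip its label
 * alignment from right to left. */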
void UI_but_type_set_menu_from_pulldown(uiBut *but)
{
    BLI_assert(but->type == UI_BTYPE_PULLDOWN);
    but->type = UI_BTYPE_MENU;
    UI_but_drawflag_disable(but, UI_BUT_TEXT_RIGHT);
    UI_but_drawflag_enable(but, UI_BUT_TEXT_LEFT);
}
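/* Read-only accessor for but->retval, the button's return value. */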
int UI_but_return_value_get(uiBut *but)
{
    return but->retval;
}
Drag and drop 2.5 integration! Finally, slashdot regulars can use
Blender too now! :)
** Drag works as follows:
- drag-able items are defined by the standard interface ui toolkit
- each button can get this feature, via uiButSetDragXXX(but, ...).
There are calls to define drag-able images, ID blocks, RNA paths,
file paths, and so on. By default you drag an icon, exceptionally
an ImBuf
- Drag items are registered centrally in the WM; it allows more drag
items simultaneously too, but that is not implemented
** Drop works as follows:
- On mouse release, and if drag items exist in the WM, it converts
the mouse event to an EVT_DROP type. This event then gets the full
drag info as customdata
- drop regions are defined with WM_dropbox_add(), similar to keymaps;
you can make a "drop map" this way, which becomes a 'drop map handler'
in the queues.
- next to that the UI kit handles some common button types (like
accepting ID or names) to catch a drop event too.
- Every "drop box" has two callbacks:
- poll() = check if the event drag data is relevant for this box
- copy() = fill in custom properties in the dropbox to initialize
an operator
- The dropbox handler then calls its standard Operator with its
dropbox properties. (A rough sketch of this poll()/copy() pair follows
the function below.)
** Currently implemented
Drag items:
- ID icons in browse buttons
- ID icons in context menu of properties region
- ID icons in outliner and rna viewer
- FileBrowser icons
- FileBrowser preview images
Drag-able icons are subtly visualized by making them a bit brighter
on mouse-over. In case the icon is a button or UI element too (most
cases), the drag-able feature will make the item react to
mouse-release instead of mouse-press.
Drop options:
- UI buttons: ID and text buttons (paste name)
- View3d: Object ID drop copies object
- View3d: Material ID drop assigns to object under cursor
- View3d: Image ID drop assigns to object UV texture under cursor
- Sequencer: Path drop will add either Image or Movie strip
- Image window: Path drop will open image
** Drag and drop Notes:
- Dropping into another Blender window (from the same application) works
too. I've added code that passes on mousemoves and clicks to other
windows, without activating them though. This does make using multi-window
Blender a bit friendlier.
- Dropping a file path to an image is not the same as dropping an
Image ID... keep this in mind. The Sequencer for example wants paths to
be dropped, textures in the 3d window want an Image ID.
- Although drop boxes could be defined via Python, I suggest they're
part of the UI and editor design (= how we want an editor to work), and
not offered configurable by default like keymaps.
- At the moment only one item can be dragged at a time. This is for
several reasons... For one, Blender doesn't have a well-defined
uniform way to define "what is selected" (files, outliner items, etc).
Secondly, there are potential conflicts on what to do when you drop mixed
drag sets on spots. All undefined stuff... nice for later.
- Example to bypass the above: a collection of images that form a strip
should be represented in the file window as a single sequence anyway.
This then will fit well and gets handled neatly by design.
- Another option to check is to allow multiple options per drop... it
could show the operator as a sort of menu, allowing arrow or scrollwheel
to choose. For the time being I'd prefer to try to design a singular drop
though, and just offer only one drop action per data type on given spots.
- What does work already, but a tad slow, is to use a function that
detects an object (type) under the cursor, so a drag item's option can be
further refined (like drop object on object = parent). (disabled)
** More notes
- Added saving for Region layouts (like split points for toolbar)
- Label buttons now handle mouse over
- File list: added full path entry for drop feature.
- Filesel bugfix: wm_operator_exec() got called there and fully handled,
while WM event code tried the same. Added a new OPERATOR_HANDLED flag for this.
Maybe python needs it too?
- Cocoa: added window move event, so multi-win setups work OK (didn't save).
- Interface_handlers.c: removed win->active
- Severe area copy bug: area handlers were not set to NULL
- Filesel bugfix: next/prev folder list was not copied on area copies
** Leftover todos
- Cocoa windows seem to hang in some cases still... needs check
- Cocoa 'draw overlap' swap doesn't work
- Cocoa window loses focus permanently on using Spotlight
(for these reasons, makefile building has Carbon as default atm)
- ListView templates in UI cannot become dragged yet, needs review...
it consists of two overlapping UI elements, preventing handling icon clicks.
- There's already Ghost library code to handle dropping from the OS
into a Blender window. I've noticed this code is unfinished for Macs, but
seems to be complete for Windows. Needs testing... currently, an external
drop event will print in the console when successfully delivered to Blender's WM.
2010-01-26 18:18:21 +00:00

void UI_but_drag_set_id(uiBut *but, ID *id)
{
    but->dragtype = WM_DRAG_ID;
    if ((but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
        MEM_SAFE_FREE(but->dragpoin);
        but->dragflag &= ~UI_BUT_DRAGPOIN_FREE;
    }
    but->dragpoin = (void *)id;
}
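To connect the drag setter above with the drop side described in the drag-and-drop
note, here is a rough sketch of a drop box with its poll() and copy() callbacks.
The callback signatures, the "View3D" drop map lookup and the operator name are
assumptions for illustration (they vary between Blender versions); only the
poll()/copy() roles and WM_dropbox_add() come from the note itself.

/* Sketch only -- signatures and names are illustrative, not the exact API. */
static int view3d_id_drop_poll(bContext *C, wmDrag *drag, const wmEvent *event)
{
    /* poll(): is this drag payload relevant for this drop region? */
    return (drag->type == WM_DRAG_ID);
}

static void view3d_id_drop_copy(wmDrag *drag, wmDropBox *drop)
{
    /* copy(): fill the drop box's operator properties from the drag data. */
    ID *id = drag->poin;
    RNA_string_set(drop->ptr, "name", id->name + 2);
}

void view3d_dropboxes_sketch(void)
{
    /* Drop boxes are grouped into a per-space "drop map", as described above. */
    ListBase *lb = WM_dropboxmap_find("View3D", SPACE_VIEW3D, 0);
    WM_dropbox_add(lb, "OBJECT_OT_add_named", view3d_id_drop_poll, view3d_id_drop_copy);
}

On the drag side, a browse button would typically be made drag-able with the
setter above, e.g. UI_but_drag_set_id(but, (ID *)ob).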
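/* Same pattern as UI_but_drag_set_id above, but the drag payload is an RNA
 * pointer and the drag type is WM_DRAG_RNA. */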
void UI_but_drag_set_rna(uiBut *but, PointerRNA *ptr)
{
    but->dragtype = WM_DRAG_RNA;
    if ((but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
        MEM_SAFE_FREE(but->dragpoin);
        but->dragflag &= ~UI_BUT_DRAGPOIN_FREE;
    }
    but->dragpoin = (void *)ptr;
|
2010-01-26 18:18:21 +00:00
|
|
|
}
|
|
|
|
|
2015-08-10 18:01:11 +02:00
|
|
|
void UI_but_drag_set_path(uiBut *but, const char *path, const bool use_free)
|
2010-01-26 18:18:21 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragtype = WM_DRAG_PATH;
|
2015-08-10 18:01:11 +02:00
|
|
|
if ((but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
|
|
|
|
MEM_SAFE_FREE(but->dragpoin);
|
|
|
|
but->dragflag &= ~UI_BUT_DRAGPOIN_FREE;
|
|
|
|
}
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragpoin = (void *)path;
|
2015-08-10 18:01:11 +02:00
|
|
|
if (use_free) {
|
|
|
|
but->dragflag |= UI_BUT_DRAGPOIN_FREE;
|
|
|
|
}
|
2010-01-26 18:18:21 +00:00
|
|
|
}
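As a usage sketch (the caller-side names here, like full_file_path, are
hypothetical), a file-browser entry could be made draggable as a path like this:
/* Hypothetical caller: duplicate the path so the button can own and later free it. */
char *path = BLI_strdup(full_file_path);
UI_but_drag_set_path(but, path, true);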
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_drag_set_name(uiBut *but, const char *name)
|
2010-01-26 18:18:21 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragtype = WM_DRAG_NAME;
|
2015-08-10 18:01:11 +02:00
|
|
|
if ((but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
|
|
|
|
MEM_SAFE_FREE(but->dragpoin);
|
|
|
|
but->dragflag &= ~UI_BUT_DRAGPOIN_FREE;
|
|
|
|
}
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragpoin = (void *)name;
|
2010-01-26 18:18:21 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
/* value from button itself */
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_drag_set_value(uiBut *but)
|
2010-01-26 18:18:21 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragtype = WM_DRAG_VALUE;
|
2010-01-26 18:18:21 +00:00
|
|
|
}
|
|
|
|
|
2015-08-10 18:01:11 +02:00
|
|
|
void UI_but_drag_set_image(uiBut *but, const char *path, int icon, struct ImBuf *imb, float scale, const bool use_free)
|
2010-01-26 18:18:21 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragtype = WM_DRAG_PATH;
|
2018-09-27 15:35:22 +02:00
|
|
|
ui_def_but_icon(but, icon, 0); /* no flag UI_HAS_ICON, so icon doesn't draw in button */
|
2015-08-10 18:01:11 +02:00
|
|
|
if ((but->dragflag & UI_BUT_DRAGPOIN_FREE)) {
|
|
|
|
MEM_SAFE_FREE(but->dragpoin);
|
|
|
|
but->dragflag &= ~UI_BUT_DRAGPOIN_FREE;
|
|
|
|
}
|
2012-03-30 01:51:25 +00:00
|
|
|
but->dragpoin = (void *)path;
|
2015-08-10 18:01:11 +02:00
|
|
|
if (use_free) {
|
|
|
|
but->dragflag |= UI_BUT_DRAGPOIN_FREE;
|
|
|
|
}
|
2012-03-30 01:51:25 +00:00
|
|
|
but->imb = imb;
|
|
|
|
but->imb_scale = scale;
|
2010-01-26 18:18:21 +00:00
|
|
|
}
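A sketch of how the image variant might be called for a file-browser preview;
filepath and preview_ibuf are placeholder names, and ICON_FILE_IMAGE is just an
example icon:
/* Sketch: drag the file path, drawing its preview ImBuf at half scale.
 * use_free is false here, so the button does not take ownership of the path. */
UI_but_drag_set_image(but, filepath, ICON_FILE_IMAGE, preview_ibuf, 0.5f, false);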
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
PointerRNA *UI_but_operator_ptr_get(uiBut *but)
|
2008-12-16 20:03:28 +00:00
|
|
|
{
|
2012-03-24 06:38:07 +00:00
|
|
|
if (but->optype && !but->opptr) {
|
2012-03-30 01:51:25 +00:00
|
|
|
but->opptr = MEM_callocN(sizeof(PointerRNA), "uiButOpPtr");
|
2009-11-24 16:19:15 +00:00
|
|
|
WM_operator_properties_create_ptr(but->opptr, but->optype);
|
2008-12-16 20:03:28 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
return but->opptr;
|
|
|
|
}
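A typical use is to fetch the lazily created properties pointer and pre-fill a
setting before the button is ever pressed; the property name below
("relative_path") is only an example, not tied to a particular operator:
/* Sketch: pre-set an operator property on the button. */
PointerRNA *opptr = UI_but_operator_ptr_get(but);
RNA_boolean_set(opptr, "relative_path", true);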
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_unit_type_set(uiBut *but, const int unit_type)
|
2010-12-10 04:10:21 +00:00
|
|
|
{
|
2019-01-04 11:05:53 +11:00
|
|
|
but->unit_type = (uchar)(RNA_SUBTYPE_UNIT_VALUE(unit_type));
|
2010-12-10 04:10:21 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
int UI_but_unit_type_get(const uiBut *but)
|
2010-12-10 04:10:21 +00:00
|
|
|
{
|
Graph Editor "Active Keyframe" settings
This commit makes some tweaks to the way that the "active keyframe"
settings in the Properties region in the Graph Editor work (for the
better, hopefully).
Basically, the problem was that previously, these were clunky and non-
intuitive to use, since they were just directly displaying the RNA
properties for those keyframes for editing purposes. But due to
limitations of RNA (i.e. from a RNA pointer to a keyframe, you
couldn't see which F-Curve you came from), several things were
impossible, notably:
1) Doing proper updates, including validating that the handles are in
a valid state - that requires access to the F-Curve to provide to the
F-Curve-based curve validity checking functions
2) Having the values of the keyframes display in whatever unit that
the property the F-Curve affects displays as - for this, you once
again need to know the F-Curve in order to resolve the property that
it affects; also the fact that only a single unit could be set for RNA
properties further limited things
This commit basically gets around these problems by moving away from a
layout-engine based approach to one where we attach custom update
callbacks and also override the units of the y-co widgets when
creating the widgets for these, thus allowing the buttons to work in
the ways that animators expect.
2011-08-06 07:01:07 +00:00
|
|
|
int ownUnit = (int)but->unit_type;
|
2018-05-23 10:47:12 +02:00
|
|
|
|
|
|
|
/* own unit define always takes precedence over RNA provided, allowing for overriding
|
Graph Editor "Active Keyframe" settings
This commit makes some tweaks to the way that the "active keyframe"
settings in the Properties region in the Graph Editor work (for the
better, hopefully).
Basically, the problem was that previously, these were clunky and non-
intuitive to use, since they were just directly displaying the RNA
properties for those keyframes for editing purposes. But due to
limitations of RNA (i.e. from a RNA pointer to a keyframe, you
couldn't see which F-Curve you came from), several things were
impossible, notably:
1) Doing proper updates, including validating that the handles are in
a valid state - that requires access to the F-Curve to provide to the
F-Curve-based curve validity checking functions
2) Having the values of the keyframes display in whatever unit that
the property the F-Curve affects displays as - for this, you once
again need to know the F-Curve in order to resolve the property that
it affects; also the fact that only a single unit could be set for RNA
properties further limited things
This commit basically gets around these problems by moving away from a
layout-engine based approach to one where we attach custom update
callbacks and also override the units of the y-co widgets when
creating the widgets for these, thus allowing the buttons to work in
the ways that animators expect.
2011-08-06 07:01:07 +00:00
|
|
|
* default value provided in RNA in a few special cases (i.e. Active Keyframe in Graph Edit)
|
|
|
|
*/
|
2012-07-07 22:51:57 +00:00
|
|
|
/* XXX: this doesn't allow clearing unit completely, though the same could be said for icons */
|
Graph Editor "Active Keyframe" settings
This commit makes some tweaks to the way that the "active keyframe"
settings in the Properties region in the Graph Editor work (for the
better, hopefully).
Basically, the problem was that previously, these were clunky and non-
intuitive to use, since they were just directly displaying the RNA
properties for those keyframes for editing purposes. But due to
limitations of RNA (i.e. from a RNA pointer to a keyframe, you
couldn't see which F-Curve you came from), several things were
impossible, notably:
1) Doing proper updates, including validating that the handles are in
a valid state - that requires access to the F-Curve to provide to the
F-Curve-based curve validity checking functions
2) Having the values of the keyframes display in whatever unit that
the property the F-Curve affects displays as - for this, you once
again need to know the F-Curve in order to resolve the property that
it affects; also the fact that only a single unit could be set for RNA
properties further limited things
This commit basically gets around these problems by moving away from a
layout-engine based approach to one where we attach custom update
callbacks and also override the units of the y-co widgets when
creating the widgets for these, thus allowing the buttons to work in
the ways that animators expect.
2011-08-06 07:01:07 +00:00
|
|
|
if ((ownUnit != 0) || (but->rnaprop == NULL)) {
|
|
|
|
return ownUnit << 16;
|
2010-12-10 04:10:21 +00:00
|
|
|
}
|
|
|
|
else {
|
Graph Editor "Active Keyframe" settings
This commit makes some tweaks to the way that the "active keyframe"
settings in the Properties region in the Graph Editor work (for the
better, hopefully).
Basically, the problem was that previously, these were clunky and non-
intuitive to use, since they were just directly displaying the RNA
properties for those keyframes for editing purposes. But due to
limitations of RNA (i.e. from a RNA pointer to a keyframe, you
couldn't see which F-Curve you came from), several things were
impossible, notably:
1) Doing proper updates, including validating that the handles are in
a valid state - that requires access to the F-Curve to provide to the
F-Curve-based curve validity checking functions
2) Having the values of the keyframes display in whatever unit that
the property the F-Curve affects displays as - for this, you once
again need to know the F-Curve in order to resolve the property that
it affects; also the fact that only a single unit could be set for RNA
properties further limited things
This commit basically gets around these problems by moving away from a
layout-engine based approach to one where we attach custom update
callbacks and also override the units of the y-co widgets when
creating the widgets for these, thus allowing the buttons to work in
the ways that animators expect.
2011-08-06 07:01:07 +00:00
|
|
|
return RNA_SUBTYPE_UNIT(RNA_property_subtype(but->rnaprop));
|
2010-12-10 04:10:21 +00:00
|
|
|
}
|
|
|
|
}
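The setter stores only the unit value (the RNA subtype shifted down), and the
getter shifts it back up before falling back to the RNA-provided unit. Assuming
the usual RNA_types.h definitions (PROP_UNIT_* values pre-shifted by 16 bits
and RNA_SUBTYPE_UNIT_VALUE() shifting down by 16), the round trip below holds;
it is a sketch to illustrate the encoding, not code from this file:
/* Sketch: the button's own unit overrides whatever but->rnaprop would report. */
UI_but_unit_type_set(but, PROP_UNIT_LENGTH);
BLI_assert(UI_but_unit_type_get(but) == PROP_UNIT_LENGTH);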
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_func_handle_set(uiBlock *block, uiBlockHandleFunc func, void *arg)
|
2008-12-10 19:22:10 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->handle_func = func;
|
|
|
|
block->handle_func_arg = arg;
|
2008-12-10 19:22:10 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_func_butmenu_set(uiBlock *block, uiMenuHandleFunc func, void *arg)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->butm_func = func;
|
|
|
|
block->butm_func_arg = arg;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_func_set(uiBlock *block, uiButHandleFunc func, void *arg1, void *arg2)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->func = func;
|
|
|
|
block->func_arg1 = arg1;
|
|
|
|
block->func_arg2 = arg2;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_block_funcN_set(uiBlock *block, uiButHandleNFunc funcN, void *argN, void *arg2)
|
2009-09-16 18:47:42 +00:00
|
|
|
{
|
2012-01-21 22:00:40 +00:00
|
|
|
if (block->func_argN) {
|
2009-09-16 18:47:42 +00:00
|
|
|
MEM_freeN(block->func_argN);
|
2012-01-21 22:00:40 +00:00
|
|
|
}
|
2009-09-16 18:47:42 +00:00
|
|
|
|
2013-04-13 16:28:39 +00:00
|
|
|
block->funcN = funcN;
|
2012-03-30 01:51:25 +00:00
|
|
|
block->func_argN = argN;
|
|
|
|
block->func_arg2 = arg2;
|
2009-09-16 18:47:42 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_func_rename_set(uiBut *but, uiButHandleRenameFunc func, void *arg1)
|
2009-06-03 18:31:37 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->rename_func = func;
|
|
|
|
but->rename_arg1 = arg1;
|
2009-06-03 18:31:37 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_func_drawextra_set(uiBlock *block, void (*func)(const bContext *C, void *idv, void *arg1, void *arg2, rcti *rect), void *arg1, void *arg2)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
block->drawextra = func;
|
|
|
|
block->drawextra_arg1 = arg1;
|
|
|
|
block->drawextra_arg2 = arg2;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_func_set(uiBut *but, uiButHandleFunc func, void *arg1, void *arg2)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->func = func;
|
|
|
|
but->func_arg1 = arg1;
|
|
|
|
but->func_arg2 = arg2;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_funcN_set(uiBut *but, uiButHandleNFunc funcN, void *argN, void *arg2)
|
2.5: ID datablock buttons are back, previously known as std_libbuttons.
The way this worked in 2.4x wasn't really clean, with events going all
over the place and using dubious variables such as G.but->lockpoin or
G.sima->menunr. It now works as follows, for example:
xco= uiDefIDPoinButs(block, CTX_data_main(C), NULL, (ID**)&sima->image, ID_IM, &sima->pin, xco, yco,
sima_idpoin_handle, UI_ID_BROWSE|UI_ID_RENAME|UI_ID_ADD_NEW|UI_ID_OPEN|UI_ID_DELETE|UI_ID_ALONE|UI_ID_PIN);
The last two parameters are a callback function and a list of events
or functionalities that are supported. The callback function then gets
the ID pointer plus the event to handle.
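A hedged sketch of such a handler (the exact callback signature is not
shown here, so it is assumed from the description - ID pointer plus
event - and the UI_ID_* values are assumed to double as the event codes;
the body is purely illustrative):
static void sima_idpoin_handle_example(struct bContext *C, struct ID *id, int event)
{
  switch (event) {
    case UI_ID_BROWSE:
      /* assign the browsed ID, e.g. to sima->image */
      break;
    case UI_ID_DELETE:
      /* clear the pointer and adjust the user count */
      break;
    default:
      break;
  }
  (void)C; (void)id;
}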
2009-02-06 16:40:14 +00:00
|
|
|
{
|
2012-01-21 22:00:40 +00:00
|
|
|
if (but->func_argN) {
|
2009-09-16 18:47:42 +00:00
|
|
|
MEM_freeN(but->func_argN);
|
2012-01-21 22:00:40 +00:00
|
|
|
}
|
2009-09-16 18:47:42 +00:00
|
|
|
|
2012-03-30 01:51:25 +00:00
|
|
|
but->funcN = funcN;
|
|
|
|
but->func_argN = argN;
|
|
|
|
but->func_arg2 = arg2;
|
2009-02-06 16:40:14 +00:00
|
|
|
}
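As the code above shows, the N variants take ownership of argN: any
previous argN is released with MEM_freeN() before the new pointer is
stored, so callers are expected to pass a MEM_-allocated block. A minimal
usage sketch (the callback, helper and payload are hypothetical):
static void example_but_cb(struct bContext *C, void *argN, void *arg2)
{
  int *payload = argN;  /* hypothetical MEM_-allocated payload */
  (void)C; (void)arg2; (void)payload;
}
static void example_attach(uiBut *but)
{
  int *payload = MEM_callocN(sizeof(int), "example payload");
  *payload = 42;
  UI_but_funcN_set(but, example_but_cb, payload, NULL);
}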
|
|
|
|
|
2014-11-09 21:20:40 +01:00
|
|
|
void UI_but_func_complete_set(uiBut *but, uiButCompleteFunc func, void *arg)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
|
2012-03-30 01:51:25 +00:00
|
|
|
but->autocomplete_func = func;
|
|
|
|
but->autofunc_arg = arg;
|
2008-11-11 18:31:32 +00:00
|
|
|
}
|
|
|
|
|
2015-11-16 06:26:25 +11:00
|
|
|
void UI_but_func_menu_step_set(uiBut *but, uiMenuStepFunc func)
|
|
|
|
{
|
|
|
|
but->menu_step_func = func;
|
|
|
|
}
|
|
|
|
|
2015-02-11 00:06:03 +01:00
|
|
|
void UI_but_func_tooltip_set(uiBut *but, uiButToolTipFunc func, void *argN)
|
|
|
|
{
|
|
|
|
but->tip_func = func;
|
|
|
|
if (but->tip_argN) {
|
|
|
|
MEM_freeN(but->tip_argN);
|
|
|
|
}
|
|
|
|
but->tip_argN = argN;
|
|
|
|
}
|
|
|
|
|
2019-03-01 13:14:16 -03:00
|
|
|
void UI_but_func_pushed_state_set(uiBut *but, uiButPushedStateFunc func, void *arg)
|
|
|
|
{
|
|
|
|
but->pushed_state_func = func;
|
|
|
|
but->pushed_state_arg = arg;
|
|
|
|
}
|
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefBlockBut(uiBlock *block, uiBlockCreateFunc func, void *arg, const char *str, int x, int y, short width, short height, const char *tip)
|
2008-11-11 18:31:32 +00:00
|
|
|
{
	uiBut *but = ui_def_but(block, UI_BTYPE_BLOCK, 0, str, x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	but->block_create_func = func;
	ui_but_update(but);
	return but;
}
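
/* As uiDefBlockBut, but stores a MEM-allocated argument in but->func_argN (any previous argN is freed) */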
uiBut *uiDefBlockButN(uiBlock *block, uiBlockCreateFunc func, void *argN, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_BLOCK, 0, str, x, y, width, height, NULL, 0.0, 0.0, 0.0, 0.0, tip);
	but->block_create_func = func;
	if (but->func_argN) {
		MEM_freeN(but->func_argN);
	}
	but->func_argN = argN;
	ui_but_update(but);
	return but;
}
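
/*
 * Illustrative usage sketch (not part of this file): block buttons such as
 * uiDefBlockBut/uiDefBlockButN open a popup block built by their
 * uiBlockCreateFunc callback. The callback name and values below are
 * hypothetical; the signature is assumed to match the uiBlockCreateFunc
 * typedef in UI_interface.h:
 *
 *   static uiBlock *options_block_create(bContext *C, ARegion *ar, void *arg) { ... }
 *
 *   uiDefBlockBut(block, options_block_create, NULL, "Options",
 *                 xco, yco, UI_UNIT_X * 5, UI_UNIT_Y, "Show extra options");
 */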
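
/* Pulldown button that opens a block created by the given uiBlockCreateFunc */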
uiBut *uiDefPulldownBut(uiBlock *block, uiBlockCreateFunc func, void *arg, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_PULLDOWN, 0, str, x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	but->block_create_func = func;
	ui_but_update(but);
	return but;
}
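
/* Pulldown button whose menu contents are built by a uiMenuCreateFunc */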
uiBut *uiDefMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_PULLDOWN, 0, str, x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	but->menu_create_func = func;
	ui_but_update(but);
	return but;
}
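
/*
 * Illustrative usage sketch (not part of this file): header menus pair
 * uiDefMenuBut with a uiMenuCreateFunc callback. The callback name below is
 * hypothetical; its signature is assumed to follow the uiMenuCreateFunc
 * typedef in UI_interface.h:
 *
 *   static void view_menu_draw(bContext *C, uiLayout *layout, void *arg) { ... }
 *
 *   uiDefMenuBut(block, view_menu_draw, NULL, "View",
 *                xco, yco, UI_UNIT_X * 5, UI_UNIT_Y, "");
 */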
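
/* Pulldown/menu button with both an icon and a text label */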
uiBut *uiDefIconTextMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, int icon, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_PULLDOWN, 0, str, x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	ui_def_but_icon(but, icon, UI_HAS_ICON);
	but->drawflag |= UI_BUT_ICON_LEFT;
	ui_but_submenu_enable(block, but);
	but->menu_create_func = func;
	ui_but_update(but);
	return but;
}
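
/* Pulldown/menu button showing only an icon */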
uiBut *uiDefIconMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, int icon, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_PULLDOWN, 0, "", x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	ui_def_but_icon(but, icon, UI_HAS_ICON);
	but->drawflag &= ~UI_BUT_ICON_LEFT;
	but->menu_create_func = func;
	ui_but_update(but);
	return but;
}
/* Block button containing both string label and icon */
uiBut *uiDefIconTextBlockBut(uiBlock *block, uiBlockCreateFunc func, void *arg, int icon, const char *str, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_BLOCK, 0, str, x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	/* XXX temp, old menu calls pass on icon arrow, which is now UI_BUT_ICON_SUBMENU flag */
	if (icon != ICON_RIGHTARROW_THIN) {
		ui_def_but_icon(but, icon, 0);
		but->drawflag |= UI_BUT_ICON_LEFT;
	}
	but->flag |= UI_HAS_ICON;
	ui_but_submenu_enable(block, but);
	but->block_create_func = func;
	ui_but_update(but);
	return but;
}
/* Block button containing icon */
uiBut *uiDefIconBlockBut(uiBlock *block, uiBlockCreateFunc func, void *arg, int retval, int icon, int x, int y, short width, short height, const char *tip)
{
	uiBut *but = ui_def_but(block, UI_BTYPE_BLOCK, retval, "", x, y, width, height, arg, 0.0, 0.0, 0.0, 0.0, tip);
	ui_def_but_icon(but, icon, UI_HAS_ICON);
	but->drawflag |= UI_BUT_ICON_LEFT;
	but->block_create_func = func;
	ui_but_update(but);
2018-05-23 10:47:12 +02:00
|
|
|
|
    return but;
}

uiBut *uiDefKeyevtButS(uiBlock *block, int retval, const char *str, int x, int y, short width, short height, short *spoin, const char *tip)
{
    uiBut *but = ui_def_but(block, UI_BTYPE_KEY_EVENT | UI_BUT_POIN_SHORT, retval, str, x, y, width, height, spoin, 0.0, 0.0, 0.0, 0.0, tip);

    ui_but_update(but);
    return but;
}

/* short pointers hardcoded */
/* modkeypoin will be set to KM_SHIFT, KM_ALT, KM_CTRL, KM_OSKEY bits */
uiBut *uiDefHotKeyevtButS(uiBlock *block, int retval, const char *str, int x, int y, short width, short height, short *keypoin, short *modkeypoin, const char *tip)
{
    uiBut *but = ui_def_but(block, UI_BTYPE_HOTKEY_EVENT | UI_BUT_POIN_SHORT, retval, str, x, y, width, height, keypoin, 0.0, 0.0, 0.0, 0.0, tip);

    but->modifier_key = *modkeypoin;

    ui_but_update(but);
    return but;
}
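A minimal usage sketch for the hotkey-event button above; example_define_hotkey_button() and its placeholder coordinates are hypothetical, only the uiDefHotKeyevtButS() call itself comes from this file:

/* sketch: a button that records a pressed key into 'key' and the
 * KM_SHIFT/KM_ALT/KM_CTRL/KM_OSKEY bits into 'modkey' */
static void example_define_hotkey_button(uiBlock *block, short *key, short *modkey)
{
    uiDefHotKeyevtButS(block, 0, "", 10, 10, 200, UI_UNIT_Y, key, modkey, "Press a key combination");
}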

/* arg is pointer to string/name, use UI_but_func_search_set() below to make this work */
/* here a1 and a2, if set, control thumbnail preview rows/cols */
uiBut *uiDefSearchBut(uiBlock *block, void *arg, int retval, int icon, int maxlen, int x, int y, short width, short height, float a1, float a2, const char *tip)
{
    uiBut *but = ui_def_but(block, UI_BTYPE_SEARCH_MENU, retval, "", x, y, width, height, arg, 0.0, maxlen, a1, a2, tip);

    ui_def_but_icon(but, icon, UI_HAS_ICON);

    but->drawflag |= UI_BUT_ICON_LEFT | UI_BUT_TEXT_LEFT;

    ui_but_update(but);

    return but;
}

/**
 * \param search_func, bfunc: both get it as \a arg.
 * \param arg: user value,
 * \param active: when set, button opens with this item visible and selected.
 */
void UI_but_func_search_set(
        uiBut *but,
        uiButSearchCreateFunc search_create_func,
        uiButSearchFunc search_func, void *arg,
        uiButHandleFunc bfunc, void *active)
{
    /* needed since callers don't have access to internal functions
     * (as an alternative we could expose it) */
    if (search_create_func == NULL) {
        search_create_func = ui_searchbox_create_generic;
    }

    but->search_create_func = search_create_func;
    but->search_func = search_func;
    but->search_arg = arg;

    if (bfunc) {
#ifdef DEBUG
        if (but->func) {
            /* watch this, can be cause of much confusion, see: T47691 */
            printf("%s: warning, overwriting button callback with search function callback!\n", __func__);
        }
#endif
        UI_but_func_set(but, bfunc, arg, active);
    }

    /* search buttons show red-alert if item doesn't exist, not for menus */
    if (0 == (but->block->flag & UI_BLOCK_LOOP)) {
        /* skip empty buttons, not all buttons need input, we only show invalid */
        if (but->drawstr[0])
            ui_but_search_refresh(but);
    }
}
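A minimal usage sketch of the uiDefSearchBut()/UI_but_func_search_set() pair, modeled on uiDefSearchButO_ptr() below; the example_* names and the search_str buffer are hypothetical:

/* sketch: search callback fills the list, call callback reacts to the picked item */
static void example_search_cb(const struct bContext *C, void *arg, const char *str, uiSearchItems *items)
{
    /* filter whatever 'arg' points to against 'str' and add matches
     * with UI_search_item_add(items, name, poin, icon) */
}

static void example_call_cb(struct bContext *C, void *arg1, void *arg2)
{
    /* act on the chosen item passed in 'arg2' */
}

static void example_define_search_button(uiBlock *block, void *arg)
{
    static char search_str[64] = "";
    uiBut *but = uiDefSearchBut(block, search_str, 0, ICON_VIEWZOOM, sizeof(search_str),
                                10, 10, 200, UI_UNIT_Y, 0, 0, "Search");

    /* passing NULL as create function falls back to ui_searchbox_create_generic (see above) */
    UI_but_func_search_set(but, NULL, example_search_cb, arg, example_call_cb, NULL);
}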

/* Callbacks for operator search button. */
static void operator_enum_search_cb(const struct bContext *C, void *but, const char *str, uiSearchItems *items)
{
    wmOperatorType *ot = ((uiBut *)but)->optype;
    PropertyRNA *prop = ot->prop;

    if (prop == NULL) {
        printf("%s: %s has no enum property set\n",
               __func__, ot->idname);
    }
    else if (RNA_property_type(prop) != PROP_ENUM) {
        printf("%s: %s \"%s\" is not an enum property\n",
               __func__, ot->idname, RNA_property_identifier(prop));
    }
    else {
        PointerRNA *ptr = UI_but_operator_ptr_get(but);  /* Will create it if needed! */
        const EnumPropertyItem *item, *item_array;
        bool do_free;

        RNA_property_enum_items_gettexted((bContext *)C, ptr, prop, &item_array, NULL, &do_free);

        for (item = item_array; item->identifier; item++) {
            /* note: need to give the index rather than the
             * identifier because the enum can be freed */
            if (BLI_strcasestr(item->name, str)) {
                if (false == UI_search_item_add(items, item->name, POINTER_FROM_INT(item->value), item->icon))
                    break;
            }
        }

        if (do_free) {
            MEM_freeN((void *)item_array);
        }
    }
}

static void operator_enum_call_cb(struct bContext *UNUSED(C), void *but, void *arg2)
{
    wmOperatorType *ot = ((uiBut *)but)->optype;
    PointerRNA *opptr = UI_but_operator_ptr_get(but);  /* Will create it if needed! */

    if (ot) {
        if (ot->prop) {
            RNA_property_enum_set(opptr, ot->prop, POINTER_AS_INT(arg2));
            /* We do not call op from here, will be called by button code.
             * ui_apply_but_funcs_after() (in interface_handlers.c) called this func before checking operators,
             * because one of its parameters is the button itself!
             */
        }
        else {
            printf("%s: op->prop for '%s' is NULL\n", __func__, ot->idname);
        }
    }
}

/**
 * Same parameters as for uiDefSearchBut, with additional operator type and properties, used by callback
 * to call again the right op with the right options (properties values).
 */
uiBut *uiDefSearchButO_ptr(
        uiBlock *block, wmOperatorType *ot, IDProperty *properties,
        void *arg, int retval, int icon, int maxlen, int x, int y,
        short width, short height, float a1, float a2, const char *tip)
{
    uiBut *but;

    but = uiDefSearchBut(block, arg, retval, icon, maxlen, x, y, width, height, a1, a2, tip);
    UI_but_func_search_set(
            but, ui_searchbox_create_generic, operator_enum_search_cb,
            but, operator_enum_call_cb, NULL);

    but->optype = ot;
    but->opcontext = WM_OP_EXEC_DEFAULT;

    if (properties) {
        PointerRNA *ptr = UI_but_operator_ptr_get(but);
        /* Copy idproperties. */
        ptr->data = IDP_CopyProperty(properties);
    }

    return but;
}
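A hedged usage sketch for uiDefSearchButO_ptr(); the operator idname is a placeholder for any operator whose primary (ot->prop) property is an enum, and search_str is a hypothetical caller-owned buffer:

/* sketch: a search button listing the enum items of an operator's primary property */
static void example_define_operator_search_button(uiBlock *block, char *search_str, int maxlen)
{
    wmOperatorType *ot = WM_operatortype_find("SOME_OT_with_enum_prop", false);  /* placeholder idname */
    if (ot) {
        uiDefSearchButO_ptr(block, ot, NULL, search_str, 0, ICON_VIEWZOOM, maxlen,
                            10, 10, 200, UI_UNIT_Y, 0, 0, "Pick an item and run the operator");
    }
}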

/**
 * push a new event onto event queue to activate the given button
 * (usually a text-field) upon entering a popup
 */
void UI_but_focus_on_enter_event(wmWindow *win, uiBut *but)
{
    wmEvent event;

    wm_event_init_from_window(win, &event);

    event.type = EVT_BUT_OPEN;
    event.val = KM_PRESS;
    event.customdata = but;
    event.customdatafree = false;

    wm_event_add(win, &event);
}
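A short usage sketch, assuming a popup/block create function where a text button 'text_but' has just been defined (example_focus_text_field is a hypothetical wrapper):

/* sketch: give keyboard focus to a freshly created text field when the popup opens */
static void example_focus_text_field(bContext *C, uiBut *text_but)
{
    UI_but_focus_on_enter_event(CTX_wm_window(C), text_but);
}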

void UI_but_func_hold_set(uiBut *but, uiButHandleHoldFunc func, void *argN)
{
    but->hold_func = func;
    but->hold_argN = argN;
}
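A hedged sketch of a hold callback; it assumes uiButHandleHoldFunc has the (bContext *, ARegion *, uiBut *) shape used by the button-handling code, and example_hold_cb is a hypothetical name that would be registered via UI_but_func_hold_set(but, example_hold_cb, NULL):

/* sketch: called once the button has been held down long enough,
 * e.g. to open a drop-down for 'but' */
static void example_hold_cb(struct bContext *C, struct ARegion *butregion, uiBut *but)
{
    /* spawn a menu or tool popup for 'but' here */
}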

void UI_but_string_info_get(bContext *C, uiBut *but, ...)
UI translation from inside Blender UI: first part.
This commit reshapes the runtime button info getter a bit, by adding a new uiButGetStrInfo() which accepts a variable number of uiStringInfo parameters and tries to fill them with the requested strings for the given button (label, tip, context, RNA identifier, keymap, etc.). Currently used mostly by the existing ui_tooltip_create(), and the new UI_OT_edittranslation_init operator.
It also adds a few getters (to get the RNA i18n context, and the current language iso code).
Finally, it adds two C operators needed for the py ui_translation addon:
*UI_OT_edittranslation_init, which gathers requested data and launches the py operator.
*UI_OT_reloadtranslation, which forces a full reload of the whole UI translation (including rechecking the directory containing mo files).
For the first operator to work, it also adds a new user preferences path: i18n_branches_directory, to point to the /branch part of a bf-translation checkout.
{
    va_list args;
    uiStringInfo *si;

    const EnumPropertyItem *items = NULL, *item = NULL;
    int totitems;
    bool free_items = false;

    va_start(args, but);
    while ((si = (uiStringInfo *) va_arg(args, void *))) {
        int type = si->type;
        char *tmp = NULL;

        if (type == BUT_GET_LABEL) {
            if (but->str) {
                const char *str_sep;
                size_t str_len;

                if ((but->flag & UI_BUT_HAS_SEP_CHAR) &&
                    (str_sep = strrchr(but->str, UI_SEP_CHAR)))
                {
                    str_len = (str_sep - but->str);
                }
                else {
                    str_len = strlen(but->str);
                }

                tmp = BLI_strdupn(but->str, str_len);
            }
            else {
                type = BUT_GET_RNA_LABEL;  /* Fail-safe solution... */
            }
        }
        else if (type == BUT_GET_TIP) {
            if (but->tip_func) {
                tmp = but->tip_func(C, but->tip_argN, but->tip);
            }
            else if (but->tip && but->tip[0])
                tmp = BLI_strdup(but->tip);
            else
                type = BUT_GET_RNA_TIP;  /* Fail-safe solution... */
        }

        if (type == BUT_GET_RNAPROP_IDENTIFIER) {
            if (but->rnaprop)
                tmp = BLI_strdup(RNA_property_identifier(but->rnaprop));
        }
        else if (type == BUT_GET_RNASTRUCT_IDENTIFIER) {
            if (but->rnaprop && but->rnapoin.data)
                tmp = BLI_strdup(RNA_struct_identifier(but->rnapoin.type));
            else if (but->optype)
                tmp = BLI_strdup(but->optype->idname);
            else if (ELEM(but->type, UI_BTYPE_MENU, UI_BTYPE_PULLDOWN)) {
                MenuType *mt = UI_but_menutype_get(but);
                if (mt) {
                    tmp = BLI_strdup(mt->idname);
                }
            }
            else if (but->type == UI_BTYPE_POPOVER) {
                PanelType *pt = UI_but_paneltype_get(but);
                if (pt) {
                    tmp = BLI_strdup(pt->idname);
                }
            }
        }
        else if (ELEM(type, BUT_GET_RNA_LABEL, BUT_GET_RNA_TIP)) {
            if (but->rnaprop) {
                if (type == BUT_GET_RNA_LABEL)
                    tmp = BLI_strdup(RNA_property_ui_name(but->rnaprop));
                else {
                    const char *t = RNA_property_ui_description(but->rnaprop);
                    if (t && t[0])
                        tmp = BLI_strdup(t);
                }
            }
            else if (but->optype) {
                if (type == BUT_GET_RNA_LABEL)
                    tmp = BLI_strdup(RNA_struct_ui_name(but->optype->srna));
                else {
                    const char *t = RNA_struct_ui_description(but->optype->srna);
                    if (t && t[0])
                        tmp = BLI_strdup(t);
                }
            }
            else if (ELEM(but->type, UI_BTYPE_MENU, UI_BTYPE_PULLDOWN)) {
                MenuType *mt = UI_but_menutype_get(but);
                if (mt) {
                    /* not all menus are from python */
                    if (mt->ext.srna) {
                        if (type == BUT_GET_RNA_LABEL)
                            tmp = BLI_strdup(RNA_struct_ui_name(mt->ext.srna));
                        else {
                            const char *t = RNA_struct_ui_description(mt->ext.srna);
                            if (t && t[0])
                                tmp = BLI_strdup(t);
                        }
                    }
                }
            }
        }
        else if (type == BUT_GET_RNA_LABEL_CONTEXT) {
            const char *_tmp = BLT_I18NCONTEXT_DEFAULT;
            if (but->rnaprop)
                _tmp = RNA_property_translation_context(but->rnaprop);
            else if (but->optype)
                _tmp = RNA_struct_translation_context(but->optype->srna);
            else if (ELEM(but->type, UI_BTYPE_MENU, UI_BTYPE_PULLDOWN)) {
                MenuType *mt = UI_but_menutype_get(but);
                if (mt)
                    _tmp = RNA_struct_translation_context(mt->ext.srna);
            }
            if (BLT_is_default_context(_tmp)) {
                _tmp = BLT_I18NCONTEXT_DEFAULT_BPYRNA;
            }
            tmp = BLI_strdup(_tmp);
        }
        else if (ELEM(type, BUT_GET_RNAENUM_IDENTIFIER, BUT_GET_RNAENUM_LABEL, BUT_GET_RNAENUM_TIP)) {
            PointerRNA *ptr = NULL;
            PropertyRNA *prop = NULL;
            int value = 0;

            /* get the enum property... */
            if (but->rnaprop && RNA_property_type(but->rnaprop) == PROP_ENUM) {
                /* enum property */
                ptr = &but->rnapoin;
                prop = but->rnaprop;
                value = (ELEM(but->type, UI_BTYPE_ROW, UI_BTYPE_TAB)) ? (int)but->hardmax : (int)ui_but_value_get(but);
            }
            else if (but->optype) {
                PointerRNA *opptr = UI_but_operator_ptr_get(but);
                wmOperatorType *ot = but->optype;

                /* so the context is passed to itemf functions */
                WM_operator_properties_sanitize(opptr, false);

                /* if the default property of the operator is enum and it is set,
                 * fetch the tooltip of the selected value so that "Snap" and "Mirror"
                 * operator menus in the Anim Editors will show tooltips for the different
                 * operations instead of the meaningless generic operator tooltip
                 */
                if (ot->prop && RNA_property_type(ot->prop) == PROP_ENUM) {
                    if (RNA_struct_contains_property(opptr, ot->prop)) {
                        ptr = opptr;
                        prop = ot->prop;
                        value = RNA_property_enum_get(opptr, ot->prop);
                    }
                }
            }

            /* get strings from matching enum item */
            if (ptr && prop) {
                if (!item) {
                    int i;

                    RNA_property_enum_items_gettexted(C, ptr, prop, &items, &totitems, &free_items);
                    for (i = 0, item = items; i < totitems; i++, item++) {
                        if (item->identifier[0] && item->value == value)
                            break;
                    }
                }
                if (item && item->identifier) {
                    if (type == BUT_GET_RNAENUM_IDENTIFIER)
                        tmp = BLI_strdup(item->identifier);
                    else if (type == BUT_GET_RNAENUM_LABEL)
                        tmp = BLI_strdup(item->name);
                    else if (item->description && item->description[0])
                        tmp = BLI_strdup(item->description);
                }
            }
        }
        else if (type == BUT_GET_OP_KEYMAP) {
            if (!ui_block_is_menu(but->block)) {
                char buf[128];
                if (ui_but_event_operator_string(C, but, buf, sizeof(buf))) {
                    tmp = BLI_strdup(buf);
                }
            }
        }
        else if (type == BUT_GET_PROP_KEYMAP) {
            /* for properties that are bound to one of the context cycle, etc. keys... */
            char buf[128];
            if (ui_but_event_property_operator_string(C, but, buf, sizeof(buf))) {
                tmp = BLI_strdup(buf);
            }
        }
UI translation from inside Blender UI: first part.
This commit reshapes a bit runtime button info getter, by adding a new uiButGetStrInfo() which accepts a variable number of uiStringInfo parameters, and tries to fill them with the requested strings, for the given button (label, tip, context, RNA identifier, keymap, etc.). Currently used mostly by existing ui_tooltip_create(), and new UI_OT_edittranslation_init operator.
It also adds a few getters (to get RNA i18n context, and current language iso code).
Finally, it adds to C operators needed for the py ui_translation addon:
*UI_OT_edittranslation_init, which gathers requested data and launch the py operator.
*UI_OT_reloadtranslation, which forces a full reload of the whole UI translation (including rechecking the directory containing mo files).
For the first operator to work, it also adds a new user preferences path: i18n_branches_directory, to point to the /branch part of a bf-translation checkout.
2012-07-09 14:25:35 +00:00
|
|
|
|
|
|
|
si->strinfo = tmp;
|
|
|
|
}
|
2013-07-13 05:46:48 +00:00
|
|
|
va_end(args);
|
UI translation from inside Blender UI: first part.
This commit reshapes a bit runtime button info getter, by adding a new uiButGetStrInfo() which accepts a variable number of uiStringInfo parameters, and tries to fill them with the requested strings, for the given button (label, tip, context, RNA identifier, keymap, etc.). Currently used mostly by existing ui_tooltip_create(), and new UI_OT_edittranslation_init operator.
It also adds a few getters (to get RNA i18n context, and current language iso code).
Finally, it adds to C operators needed for the py ui_translation addon:
*UI_OT_edittranslation_init, which gathers requested data and launch the py operator.
*UI_OT_reloadtranslation, which forces a full reload of the whole UI translation (including rechecking the directory containing mo files).
For the first operator to work, it also adds a new user preferences path: i18n_branches_directory, to point to the /branch part of a bf-translation checkout.
2012-07-09 14:25:35 +00:00
|
|
|
|
2017-10-18 15:07:26 +11:00
|
|
|
if (free_items && items) {
|
|
|
|
MEM_freeN((void *)items);
|
|
|
|
}
|
UI translation from inside Blender UI: first part.
This commit reshapes a bit runtime button info getter, by adding a new uiButGetStrInfo() which accepts a variable number of uiStringInfo parameters, and tries to fill them with the requested strings, for the given button (label, tip, context, RNA identifier, keymap, etc.). Currently used mostly by existing ui_tooltip_create(), and new UI_OT_edittranslation_init operator.
It also adds a few getters (to get RNA i18n context, and current language iso code).
Finally, it adds to C operators needed for the py ui_translation addon:
*UI_OT_edittranslation_init, which gathers requested data and launch the py operator.
*UI_OT_reloadtranslation, which forces a full reload of the whole UI translation (including rechecking the directory containing mo files).
For the first operator to work, it also adds a new user preferences path: i18n_branches_directory, to point to the /branch part of a bf-translation checkout.
2012-07-09 14:25:35 +00:00
|
|
|
}
|
|
|
|
|
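For reference, a caller-side sketch of the new getter. The field names (type, strinfo) and the MEM_freeN ownership follow the excerpt above; the exact leading arguments and the NULL terminator for the varargs are assumptions:

/* Hypothetical caller (e.g. tooltip or translation-edit code); not copied from the commit. */
uiStringInfo enum_label = {BUT_GET_RNAENUM_LABEL, NULL};  /* field order assumed: type, then strinfo */
uiStringInfo op_keymap  = {BUT_GET_OP_KEYMAP, NULL};

/* variadic: pass any number of uiStringInfo pointers, NULL-terminated (assumed convention) */
uiButGetStrInfo(C, but, &enum_label, &op_keymap, NULL);

if (enum_label.strinfo) {
    printf("enum label: %s\n", enum_label.strinfo);
    MEM_freeN(enum_label.strinfo);  /* strings are duplicated with BLI_strdup, caller frees */
}
if (op_keymap.strinfo) {
    printf("shortcut: %s\n", op_keymap.strinfo);
    MEM_freeN(op_keymap.strinfo);
}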
/* Program Init/Exit */

void UI_init(void)
{
    ui_resources_init();
}
2.5
Summary of main features:
- Themes and Styles are now editable.
- CTRL+U "Save user defaults" now goes to a new .B25.blend, so you can use
2.4x and 2.5x next to each other. If .B25.blend doesn't exist, it reads the
regular .B.blend.
- Press Tkey in the 3D window for the (unfinished) toolbar WIP. It now only
shows the last operator, if appropriate.
Nkey properties moved to the other side.
A lot of work was done on removing old themes for good and properly getting
this to work with the 2.5 region system. Here are some notes:
- Buttons now all have a complete set of colors, based on button
classifications (see Outliner -> User Prefs -> Interface).
- Theme colors have been extended with basic colors for region types.
Currently colors are defined for Window, Header, List/Channels and for
Button/Tool views.
The screen manager handles this, so TH_BACK will always pick the right
backdrop color.
- Menu backdrops are in Button theme colors. Floating Panels will be in the
per-space type Themes.
- Styles were added in RNA too, but only for the font settings for now.
Only the Panel font, widget font and widget-label work now. The 'group label'
will be mostly for templates.
Style settings will be expanded with spacing defaults, label conventions, etc.
- Label text colors are stored in the per-space Theme too, to make sure they
fit. The same goes for the Panel title color.
Note that 'shadow' for fonts can conflict with text colors; the shadow color
is currently stored in Style... the shadow code still needs a bit of work.
2009-04-27 13:44:11 +00:00
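As a usage sketch, region drawing can stay theme-agnostic by asking for generic color IDs and letting the theme system resolve them per region type. The getters below are the UI_resources-style calls assumed to be available to draw code; my_region_draw is hypothetical:

/* Sketch: clear the backdrop and fetch a text color through theme IDs.
 * Which concrete colors come back depends on the region/space being drawn. */
static void my_region_draw(const bContext *C, ARegion *ar)
{
    float text_col[3];

    UI_ThemeClearColor(TH_BACK);  /* backdrop color for this region type */
    glClear(GL_COLOR_BUFFER_BIT);

    UI_GetThemeColor3fv(TH_TEXT, text_col);
    /* ... draw the region contents using text_col ... */

    (void)C;
    (void)ar;
}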
/* after reading userdef file */
void UI_init_userdef(Main *bmain)
{
    /* fix saved themes */
    init_userdef_do_versions(bmain);
    uiStyleInit();
}

void UI_reinit_font(void)
{
    uiStyleInit();
}

void UI_exit(void)
{
    ui_resources_free();
    ui_but_clipboard_free();
}
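Taken together, the intended lifecycle from the window-manager side is roughly the following sketch (the wrapper function and the exact call sites are assumptions; only the UI_* calls come from the code above):

/* Sketch of the init/exit order implied by the functions above; wm_ui_lifecycle_example is hypothetical. */
void wm_ui_lifecycle_example(Main *bmain)
{
    UI_init();               /* once at startup: set up UI resources */

    /* ... after the userdef file has been read ... */
    UI_init_userdef(bmain);  /* version-patch saved themes, initialize styles/fonts */

    /* ... if the UI font changes at runtime ... */
    UI_reinit_font();        /* re-run style/font initialization */

    /* ... run the application ... */

    UI_exit();               /* at shutdown: free UI resources and the button clipboard */
}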