Port of part of the Interface code to 2.50.

This is based on the current trunk version, so these files should not need merges. There are two things (clipboard and intptr_t) that are missing in 2.50 and commented out with XXX 2.48; these can be enabled again once trunk is merged into this branch.

Further, this is not all of the interface code; many parts are still commented out:
* interface.c: nearly all button types; missing: links, chartab, keyevent.
* interface_draw.c: almost all code, with some small exceptions.
* interface_ops.c: this replaces ui_do_but and uiDoBlocks with two operators, making it non-blocking.
* interface_regions: a part of interface.c, split off; contains code to create regions for tooltips, menus, pupmenu (that one is currently crashing) and the color chooser, basically regions with buttons, which is fairly independent of the core interface code.
* interface_panel.c and interface_icons.c: not ported over, so no panels and icons yet. Panels should probably become (free-floating) regions?
* text.c (formerly language.c): for drawing text and translation. This works but still uses bad globals and could be cleaned up.

Header Files:
* ED_datafiles.h now has declarations for datatoc_ files, so those extern declarations can be #included instead of repeated.
* The user interface code is in UI_interface.h and other UI_* files.

Core:
* The API for creating blocks, buttons, etc. is still nearly the same. Blocks are now created per region instead of per area.
* The code was made non-blocking, which means that any changes and redraws should be possible while editing a button. That does mean we need some sort of persistence, even though the Blender model is to recreate buttons for each redraw. So when a new block is created, some matching happens to find out which buttons correspond to buttons in the previously created block, and for activated buttons some data is then copied over to the new button.
* Added UI_init/UI_init_userdef/UI_exit functions that should initialize code in this module, instead of multiple function calls in the window manager.
* Removed most statics/globals from interface.c.
* Removed UIafterfunc_; I don't think it's needed anymore, and I'm not sure how it would integrate here.
* Currently only full window redraws are used; this should become per-region and maybe per-button later.

Operators:
* Events are currently handled through two operators: button activate and menu handle. Operators may not be the best way to implement this, since there are currently some issues with events being missed, but they can become a special handler type instead; this should not be a big change.
* The button activate operator runs as long as a button is active, and will handle all interaction with that button until the button is no longer activated. This means clicking, text editing, number dragging, opening menu blocks, etc.
* Since this operator has to be non-blocking, the ui_do_but code needed to be made non-blocking. That means variables that were previously on the stack now need to be stored away in a struct, so they can be accessed again when the operator receives more events.
* Additionally, the place in the ui_do_but code used to indicate the state; now that needs to be set explicitly in order to handle the right events in the right state. An activated button can be in one of these states: init, highlight, wait_flash, wait_release, wait_key_event, num_editing, text_editing, text_selecting, block_open, exit (see the sketch after these notes).
* For each button type, a ui_apply_but_* function has also been separated out from ui_do_but. This makes it possible to continuously apply the button as text is being typed, for example, and there is an option in the code to enable this. Since the code is non-blocking and can even deal with the button being deleted, it should be safe to do this.
* When editing text, dragging numbers, etc., the actual data (but->poin) is not edited directly, since that would mean data is changed without correct updates happening while some other part of Blender may be accessing it in the meantime. So data values, strings and vectors are written to a temporary location and only flushed in the apply function.

Regions:
* Menus, the color chooser, tooltips etc. all create screen-level regions. Such a menu block gives a handle to the button that creates it; that handle will contain the results of the menu block once a MESSAGE event is received from that menu block.
* For this type of menu block the coordinates used to be in window space. They are still created that way, and ui_positionblock still works with window coordinates, but after that the block and buttons are brought back to region coordinates, since these are now contained in a region.
* The flush/overdraw frontbuffer drawing code was removed; the window manager should have enough information with these screen-level regions to have full control over what gets drawn when, and to then do correct compositing.

Testing:
* The header in the time space currently has some buttons to test the UI code.
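As a rough illustration of the states listed above: they could be collected in an enum and stored, together with the other formerly stack-based variables, in a per-button handling struct. This is only a sketch; the actual enum and struct names in the handling code may differ.

    /* hypothetical representation of the activated-button states described above */
    typedef enum uiButtonActivateState {
        BUTTON_STATE_INIT,
        BUTTON_STATE_HIGHLIGHT,
        BUTTON_STATE_WAIT_FLASH,
        BUTTON_STATE_WAIT_RELEASE,
        BUTTON_STATE_WAIT_KEY_EVENT,
        BUTTON_STATE_NUM_EDITING,
        BUTTON_STATE_TEXT_EDITING,
        BUTTON_STATE_TEXT_SELECTING,
        BUTTON_STATE_BLOCK_OPEN,
        BUTTON_STATE_EXIT
    } uiButtonActivateState;

    /* sketch of the persistent handling data; the real struct holds many more
     * of the variables that used to live on the ui_do_but stack */
    typedef struct uiButtonActivateData {
        struct uiBut *but;             /* the button being handled */
        uiButtonActivateState state;   /* decides which events are valid right now */
    } uiButtonActivateData;
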
/*
 * ***** BEGIN GPL LICENSE BLOCK *****
 *
 * This program is free software; you can redistribute it and/or
 * modify it under the terms of the GNU General Public License
 * as published by the Free Software Foundation; either version 2
 * of the License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program; if not, write to the Free Software Foundation,
 * Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
 *
 * The Original Code is Copyright (C) 2001-2002 by NaN Holding BV.
 * All rights reserved.
 *
 * The Original Code is: all of this file.
 *
 * Contributor(s): none yet.
 *
 * ***** END GPL LICENSE BLOCK *****
 */

/** \file UI_interface.h
 *  \ingroup editorui
 */

#ifndef __UI_INTERFACE_H__
#define __UI_INTERFACE_H__

#include "BLI_compiler_attrs.h"
#include "BLI_sys_types.h" /* size_t */
#include "RNA_types.h"

Code holiday commit:
- Fix: user preferences, window title was reset to 'Blender' on tab usage.
- Undo history menu is back:
  - name "Undo History"
  - hotkey Alt+Ctrl+Z (Alt+Apple+Z for Mac)
  - works like 2.4x, only for global undo, editmode and particle edit.
- Menu scroll:
  - for small windows or screens, popup menus can now display all items, using internal scrolling
  - works with a timer, scrolling 10 items per second while the mouse is over the top or bottom arrow
  - if the menu is too big to display, it now draws towards the top or bottom, based on the largest available space
  - also works for hotkey-driven popup menus.
- User preference "DPI" now drives widget/layout size (a rough sketch follows these notes):
  - widgets and headers become bigger or smaller to match the 'dpi' font sizes; works well to match the UI to the monitor size
  - note that icons can get fuzzy; we need better mipmaps for that.
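A minimal sketch of the idea behind the DPI preference driving widget and layout sizes; the helper name and the 72-dpi reference value are assumptions for illustration, not necessarily how the actual scaling is computed.

    /* hypothetical helper: derive a UI scale factor from the user's DPI preference,
     * treating 72 dpi as the reference size (assumed value, for illustration only) */
    static float ui_dpi_scale_factor(const UserDef *userdef)
    {
        return (float)userdef->dpi / 72.0f;
    }

    /* a widget that is 20 px high at the reference DPI would then be drawn at roughly: */
    /* int height = (int)(20.0f * ui_dpi_scale_factor(&U) + 0.5f);   (U = global prefs) */
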
#include "DNA_userdef_types.h"

2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers too; this is done with a uiDefMenuBut, which takes a pointer to a uiMenuCreateFunc that will then call the uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are specific to making popup menus (see the sketch after these notes). Will convert the other confirmation popup menu functions to use this too, so we can remove some code.
* Extended the uiMenu functions; there is now also: BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted the image window headers to use uiMenu functions, which simplifies the menu code further here. Did not remove the uiDefMenu functions, as they are still used in the sequencer/view3d in some places (will fix).
* Also tried to simplify and improve the bounds computation for popup menus. It used to try to find out the size of the menu in advance, but this is difficult with keymap strings in there; now uiPopupBoundsBlock can figure this out afterwards and ensure the popup is within the window bounds. Will convert some other functions to use this too.
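As a rough illustration of the uiPupMenuBegin/uiPupMenuEnd flow described above; the exact signatures, the layout accessor and the operator names used here are assumptions for illustration only.

    /* sketch: build and show a small popup menu from inside an operator or handler */
    static void example_show_popup(struct bContext *C)
    {
        uiPopupMenu *pup = uiPupMenuBegin(C, "Example Menu", 0);  /* title, icon */
        uiLayout *layout = uiPupMenuLayout(pup);

        /* add a couple of operator entries (operator names are placeholders) */
        uiItemO(layout, NULL, 0, "SCREEN_OT_repeat_last");
        uiItemO(layout, NULL, 0, "SCREEN_OT_redo_last");

        uiPupMenuEnd(C, pup);  /* hand the menu block over to the window manager */
    }
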
/* Struct Declarations */
struct ID;

2.5: ID datablock button back, previously known as std_libbuttons. The way this worked in 2.4x wasn't really clean, with events going all over the place and using dubious variables such as G.but->lockpoin or G.sima->menunr. It works as follows now, for example:

    xco = uiDefIDPoinButs(block, CTX_data_main(C), NULL, (ID **)&sima->image, ID_IM, &sima->pin, xco, yco,
                          sima_idpoin_handle, UI_ID_BROWSE|UI_ID_RENAME|UI_ID_ADD_NEW|UI_ID_OPEN|UI_ID_DELETE|UI_ID_ALONE|UI_ID_PIN);

The last two parameters are a callback function and a list of events or functionalities that are supported. The callback function will then get the ID pointer plus the event to handle.

struct Main;
struct ListBase;
struct ARegion;
struct ARegionType;
struct ScrArea;
struct wmEvent;
struct wmWindow;
struct wmWindowManager;
struct wmOperator;
struct AutoComplete;
struct bContext;
struct bContextStore;
struct Panel;
struct PanelType;

UI: don't use operators anymore for handling user interface events, but rather a special UI handler, which makes the code clearer. This UI handler is attached to the region along with the other handlers, and also gets a callback when all handlers for the region are removed, to ensure things are properly cleaned up. This should fix XXX's in the UI code related to events and context switching.

Most of the changes are in interface_handlers.c, which was renamed from interface_ops.c, to convert the operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument; this is required to properly cancel things like timers or tooltips when the region gets removed.
* UI_add_region_handlers will add the region-level UI handlers, to be used when adding keymap handlers etc. This replaces the UI keymap (see the sketch after these notes).
* When the UI code starts a modal interaction (number sliding, text editing, opening a menu, ...), it will add a UI handler at the window level which will block events.

Window manager changes:
* Added a UI handler next to the existing keymap and operator modal handlers. It has event-handling and remove callbacks, and like operator modal handlers it will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the window/area/region restored in the context, like the operator modal and UI handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the region. This was already happening for areas, but UI buttons are at the region level, so we need it there.

Issues:
* Tooltips and menus stay open when switching to another window, and button highlight doesn't work without moving the mouse first when Blender starts up. I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem to arrive.
* Timeline header buttons seem to be moving by a pixel or so sometimes when interacting with them.
* Seemingly not due to this commit, but UI and keymap handlers are leaking. It seems that handlers are being added to regions in all screens, also in regions of areas that are not visible, but these handlers are not removed. Probably there should only be handlers in visible regions?
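A minimal sketch of how an editor's region init callback might register these region-level UI handlers; everything except the UI_add_region_handlers call itself is an illustrative assumption.

    /* sketch: a header region's init callback hooking in the UI handlers
     * next to where its keymap handlers would be added */
    static void example_header_region_init(struct wmWindowManager *wm, struct ARegion *ar)
    {
        UI_add_region_handlers(&ar->handlers);   /* region-level UI event handling */
        /* ... editor-specific keymap handlers would be added here as well ... */
        (void)wm;
    }
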
struct PointerRNA;
struct PropertyRNA;
struct ReportList;
struct rcti;
struct rctf;

This commit frees list UI items from their dependency on Panel, and hence from all the limitations this implied (mostly, the "only one list per panel" one).

It introduces a new (py-extendable and registrable) RNA type, UIList (roughly similar to the Panel one), which currently contains only the "standard" list's scroll position and size (but may be extended to include e.g. some filtering data, etc.). This now makes lists completely independent from Panels!

This UIList has a draw_item callback which allows customizing items' drawing from Python, which all addons can now use. Incidentally, this also greatly simplifies the C code of this widget, as we do not code any "special case" here anymore!

To make all this work, other changes were also necessary:
* All buttons (the uiBut struct) now have a 'custom_data' void pointer, currently used to store the uiList struct associated with a given uiLayoutListBox.
* DynamicPaintSurface now exposes a new boolean, use_color_preview (read-only), saying whether that surface has some 3D view preview data or not.
* The UILayout class now has four new (static) functions, to get the actual icon of any RNA object (important e.g. with materials or textures), and to get an enum item's UI name, description and icon.
* UILayout's label() func now takes an optional 'icon_value' integer parameter which, if not zero, will override the 'icon' one (mandatory to use "custom" icons as generated for material/texture/... previews).

Note: not sure whether we should add that one to all of UILayout's prop funcs?
Note: will update addons using the template list asap.

struct uiList;
struct uiStyle;
struct uiFontStyle;
struct uiWidgetColors;
struct ColorBand;
struct CurveMapping;
struct Image;
struct ImageUser;
struct wmOperatorType;
struct uiWidgetColors;
struct Tex;
struct MTex;

Drag and drop 2.5 integration! Finally, slashdot regulars can use Blender too now! :)

** Drag works as follows:
- Drag-able items are defined by the standard interface UI toolkit.
- Each button can get this feature via uiButSetDragXXX(but, ...). There are calls to define drag-able images, ID blocks, RNA paths, file paths, and so on. By default you drag an icon, exceptionally an ImBuf.
- Drag items are registered centrally in the WM; this would allow multiple simultaneous drag items too, but that is not implemented.

** Drop works as follows:
- On mouse release, if drag items exist in the WM, the mouse event is converted to an EVT_DROP type. This event then gets the full drag info as customdata.
- Drop regions are defined with WM_dropbox_add(); similar to keymaps, you can make a "drop map" this way, which becomes a 'drop map handler' in the queues (see the sketch after these notes).
- Next to that, the UI kit lets some common button types (like those accepting an ID or name) catch a drop event too.
- Every "drop box" has two callbacks:
  - poll() = check if the event's drag data is relevant for this box
  - copy() = fill in custom properties in the dropbox to initialize an operator
- The dropbox handler then calls its standard operator with its dropbox properties.

** Currently implemented
Drag items:
- ID icons in browse buttons
- ID icons in the context menu of the properties region
- ID icons in the outliner and RNA viewer
- FileBrowser icons
- FileBrowser preview images
Drag-able icons are subtly visualized by making them a bit brighter on mouse-over. In case the icon is a button or UI element too (most cases), the drag-able feature makes the item react to mouse-release instead of mouse-press.
Drop options:
- UI buttons: ID and text buttons (paste name)
- View3D: Object ID drop copies the object
- View3D: Material ID drop assigns to the object under the cursor
- View3D: Image ID drop assigns to the object's UV texture under the cursor
- Sequencer: path drop will add either an Image or Movie strip
- Image window: path drop will open the image

** Drag and drop notes:
- Dropping into another Blender window (from the same application) works too. I've added code that passes on mouse moves and clicks to other windows, without activating them though. This does make using multi-window Blender a bit friendlier.
- Dropping a file path onto an image is not the same as dropping an Image ID... keep this in mind. The sequencer, for example, wants paths to be dropped; textures in the 3D window want an Image ID.
- Although drop boxes could be defined via Python, I suggest they're part of the UI and editor design (= how we want an editor to work), and not offered as configurable by default like keymaps.
- At the moment only one item can be dragged at a time. This is for several reasons... For one, Blender doesn't have a well-defined, uniform way to define "what is selected" (files, outliner items, etc.). Secondly, there are potential conflicts on what to do when you drop mixed drag sets on spots. All undefined stuff... nice for later.
- Example to bypass the above: a collection of images that forms a strip should be represented in the file window as a single sequence anyway. That then fits well and gets handled neatly by design.
- Another option to check is to allow multiple options per drop... it could show the operator as a sort of menu, allowing the arrow keys or scroll wheel to choose. For the time being I'd prefer to try to design a singular drop though, offering only one drop action per data type on a given spot.
- What does work already, but is a tad slow, is to use a function that detects the object (type) under the cursor, so a drag item's option can be further refined (like drop object on object = parent). (disabled)

** More notes
- Added saving for region layouts (like split points for the toolbar).
- Label buttons now handle mouse-over.
- File list: added a full path entry for the drop feature.
- File select bugfix: wm_operator_exec() got called there and fully handled, while the WM event code tried the same. Added a new OPERATOR_HANDLED flag for this. Maybe Python needs it too?
- Cocoa: added a window move event, so multi-window setups work OK (they didn't save).
- interface_handlers.c: removed win->active.
- Severe area copy bug: area handlers were not set to NULL.
- File select bugfix: the next/prev folder list was not copied on area copies.

** Leftover todos
- Cocoa windows still seem to hang in some cases... needs checking.
- Cocoa 'draw overlap' swap doesn't work.
- Cocoa window loses focus permanently on using Spotlight (for these reasons, makefile building has Carbon as default atm).
- ListView templates in the UI cannot be dragged yet, needs review... they consist of two overlapping UI elements, preventing handling of icon clicks.
- There's already Ghost library code to handle dropping from the OS into a Blender window. I've noticed this code is unfinished for Macs, but it seems to be complete for Windows. Needs testing... currently, an external drop event will print to the console when successfully delivered to Blender's WM.
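A rough sketch of the drop-map registration described above; the callback signatures, the wmDrag fields and the operator/property names here are assumptions for illustration, not a definitive API reference.

    /* sketch: a drop box that runs an operator when an ID datablock is dropped */
    static int example_id_drop_poll(struct bContext *C, struct wmDrag *drag, struct wmEvent *event)
    {
        (void)C; (void)event;
        return (drag->type == WM_DRAG_ID);   /* only interested in dragged IDs */
    }

    static void example_id_drop_copy(struct wmDrag *drag, struct wmDropBox *drop)
    {
        struct ID *id = drag->poin;
        /* fill the operator's (assumed) "name" property from the dragged datablock */
        RNA_string_set(drop->ptr, "name", id->name + 2);
    }

    /* registered from an editor's dropboxes() callback, roughly: */
    /* WM_dropbox_add(lb, "OBJECT_OT_add_named", example_id_drop_poll, example_id_drop_copy); */
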
struct ImBuf;
struct bNodeTree;
struct bNode;
struct bNodeSocket;

typedef struct uiBut uiBut;
typedef struct uiBlock uiBlock;
typedef struct uiPopupBlockHandle uiPopupBlockHandle;
typedef struct uiLayout uiLayout;

/* Defines */
|
|
|
|
|
2013-09-11 05:06:27 +00:00
|
|
|
/* char for splitting strings, aligning shortcuts in menus, users never see */
|
|
|
|
#define UI_SEP_CHAR '|'
|
|
|
|
#define UI_SEP_CHAR_S "|"
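A standalone illustration (the define is repeated locally so the snippet compiles on its own) of how a menu label and its shortcut hint can share one string and be split on this character:

#include <stdio.h>
#include <string.h>

#define UI_SEP_CHAR '|'   /* same value as above, repeated so the example stands alone */

int main(void)
{
	char label[] = "Duplicate|Shift D";       /* item name and shortcut hint in one string */
	char *sep = strchr(label, UI_SEP_CHAR);   /* users never see the separator itself */

	if (sep) {
		*sep = '\0';
		printf("name: \"%s\"  shortcut: \"%s\"\n", label, sep + 1);
	}
	return 0;
}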
|
|
|
|
|
2011-11-07 22:28:49 +00:00
|
|
|
/* names */
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_MAX_DRAW_STR 400
|
|
|
|
#define UI_MAX_NAME_STR 128
|
2011-11-07 22:28:49 +00:00
|
|
|
|
2013-02-07 02:03:31 +00:00
|
|
|
/* use for clamping popups within the screen */
|
|
|
|
#define UI_SCREEN_MARGIN 10
|
|
|
|
|
2013-11-21 14:43:08 +01:00
|
|
|
/* uiBlock->dt and uiBut->dt */
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_EMBOSS 0 /* use widget style for drawing */
|
|
|
|
#define UI_EMBOSSN 1 /* Nothing, only icon and/or text */
|
|
|
|
#define UI_EMBOSSP 2 /* Pulldown menu style */
|
|
|
|
#define UI_EMBOSST 3 /* Table */
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
/* uiBlock->direction */
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_DIRECTION (UI_TOP | UI_DOWN | UI_LEFT | UI_RIGHT)
|
|
|
|
#define UI_TOP (1 << 0)
|
|
|
|
#define UI_DOWN (1 << 1)
|
|
|
|
#define UI_LEFT (1 << 2)
|
|
|
|
#define UI_RIGHT (1 << 3)
|
|
|
|
#define UI_CENTER (1 << 4)
|
|
|
|
#define UI_SHIFT_FLIPPED (1 << 5)
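Since the four direction bits are disjoint and UI_DIRECTION is simply their union, code can mask out the extra bits (UI_CENTER, UI_SHIFT_FLIPPED) before comparing; a standalone sketch with the values repeated from above:

#include <stdio.h>

#define UI_TOP        (1 << 0)
#define UI_DOWN       (1 << 1)
#define UI_LEFT       (1 << 2)
#define UI_RIGHT      (1 << 3)
#define UI_CENTER     (1 << 4)
#define UI_DIRECTION  (UI_TOP | UI_DOWN | UI_LEFT | UI_RIGHT)

int main(void)
{
	int direction = UI_DOWN | UI_CENTER;   /* e.g. a centered pulldown opening downward */

	if ((direction & UI_DIRECTION) == UI_DOWN) {
		printf("block opens downward\n");
	}
	return 0;
}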
|
|
|
|
|
|
|
|
#if 0
|
2009-02-27 10:22:40 +00:00
|
|
|
/* uiBlock->autofill (not yet used) */
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_BLOCK_COLLUMNS 1
|
|
|
|
#define UI_BLOCK_ROWS 2
|
|
|
|
#endif
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
/* uiBlock->flag (controls) */
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_BLOCK_LOOP (1 << 0)
|
|
|
|
#define UI_BLOCK_REDRAW (1 << 1)
|
|
|
|
#define UI_BLOCK_SEARCH_MENU (1 << 2)
|
|
|
|
#define UI_BLOCK_NUMSELECT (1 << 3)
|
|
|
|
#define UI_BLOCK_NO_WIN_CLIP (1 << 4) /* don't apply window clipping */ /* was UI_BLOCK_ENTER_OK */
|
|
|
|
#define UI_BLOCK_CLIPBOTTOM (1 << 5)
|
|
|
|
#define UI_BLOCK_CLIPTOP (1 << 6)
|
|
|
|
#define UI_BLOCK_MOVEMOUSE_QUIT (1 << 7)
|
|
|
|
#define UI_BLOCK_KEEP_OPEN (1 << 8)
|
|
|
|
#define UI_BLOCK_POPUP (1 << 9)
|
|
|
|
#define UI_BLOCK_OUT_1 (1 << 10)
|
|
|
|
#define UI_BLOCK_NO_FLIP (1 << 11)
|
|
|
|
#define UI_BLOCK_POPUP_MEMORY (1 << 12)
|
|
|
|
#define UI_BLOCK_CLIP_EVENTS (1 << 13) /* stop handling mouse events */
|
|
|
|
|
2013-11-21 14:43:08 +01:00
|
|
|
/* block->flag bits 14-17 are identical to but->drawflag bits */
|
2008-11-11 18:31:32 +00:00
|
|
|
|
Fix [#35750] list items in properties editor (text colors not following list item theme).
The issue goes back to when we stopped using the LISTROW button to draw the item's name (i.e. since we have custom buttons in list items!).
This commit:
* Adds a new flag to uiBlock, UI_BLOCK_LIST_ITEM, to mark blocks used for each list item.
* Adds a new button type, LISTLABEL, which basically behaves exactly like LABEL but uses the wcol_list_item color set.
* When uiItemL is called, it checks whether the current block has UI_BLOCK_LIST_ITEM set, and if so, switches the produced button to the LISTLABEL type (see the sketch after this message).
* Adds a new helper func, ui_layout_list_set_labels_active, called after the active list item has been "drawn", to set all LISTLABEL buttons as UI_SELECT.
Note the custom widget_state_label() was removed in interface_widgets.c, as it did nothing more than the default widget_state().
Thanks to Brecht for the review and advice.
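A minimal sketch of the switch described above, assuming the struct members used here (block->flag, but->type) and the LISTLABEL value; the real logic lives elsewhere in the layout code:

/* Sketch of the idea only -- not the actual uiItemL implementation. */
static void ui_item_pick_label_type(uiBlock *block, uiBut *but)
{
	/* Labels placed inside a list-item block should use the list-item theme colors. */
	if (block->flag & UI_BLOCK_LIST_ITEM) {
		but->type = LISTLABEL;   /* behaves like LABEL, drawn with wcol_list_item */
	}
}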
2013-06-26 07:28:55 +00:00
|
|
|
#define UI_BLOCK_LIST_ITEM (1 << 19)
|
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* uiPopupBlockHandle->menuretval */
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_RETURN_CANCEL (1 << 0) /* cancel all menus cascading */
|
|
|
|
#define UI_RETURN_OK (1 << 1) /* choice made */
|
|
|
|
#define UI_RETURN_OUT (1 << 2) /* left the menu */
|
|
|
|
#define UI_RETURN_OUT_PARENT (1 << 3) /* let the parent handle this event */
|
|
|
|
#define UI_RETURN_UPDATE (1 << 4) /* update the button that opened */
|
|
|
|
#define UI_RETURN_POPUP_OK (1 << 5) /* popup is ok to be handled */
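A standalone sketch (values repeated locally) of how a caller might interpret these bits once a popup reports back through uiPopupBlockHandle->menuretval:

#include <stdio.h>

#define UI_RETURN_CANCEL  (1 << 0)   /* cancel all menus cascading */
#define UI_RETURN_OK      (1 << 1)   /* choice made */
#define UI_RETURN_OUT     (1 << 2)   /* left the menu */

static void handle_menu_result(int menuretval)
{
	if (menuretval & UI_RETURN_OK) {
		printf("choice made, apply the result\n");
	}
	else if (menuretval & (UI_RETURN_CANCEL | UI_RETURN_OUT)) {
		printf("closed without a choice\n");
	}
}

int main(void)
{
	handle_menu_result(UI_RETURN_OK);
	handle_menu_result(UI_RETURN_OUT);
	return 0;
}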
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
/* panel controls */
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_PNL_SOLID (1 << 1)
|
|
|
|
#define UI_PNL_CLOSE (1 << 5)
|
|
|
|
#define UI_PNL_SCALE (1 << 9)
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2013-11-21 14:43:08 +01:00
|
|
|
/* but->flag - general state flags. */
|
|
|
|
enum {
|
|
|
|
/* warning, the first 6 flags are internal */
|
|
|
|
UI_ICON_SUBMENU = (1 << 6),
|
|
|
|
UI_ICON_PREVIEW = (1 << 7),
|
|
|
|
|
|
|
|
UI_BUT_NODE_LINK = (1 << 8),
|
|
|
|
UI_BUT_NODE_ACTIVE = (1 << 9),
|
|
|
|
UI_BUT_DRAG_LOCK = (1 << 10),
|
|
|
|
UI_BUT_DISABLED = (1 << 11),
|
|
|
|
UI_BUT_COLOR_LOCK = (1 << 12),
|
|
|
|
UI_BUT_ANIMATED = (1 << 13),
|
|
|
|
UI_BUT_ANIMATED_KEY = (1 << 14),
|
|
|
|
UI_BUT_DRIVEN = (1 << 15),
|
|
|
|
UI_BUT_REDALERT = (1 << 16),
|
|
|
|
UI_BUT_INACTIVE = (1 << 17),
|
|
|
|
UI_BUT_LAST_ACTIVE = (1 << 18),
|
|
|
|
UI_BUT_UNDO = (1 << 19),
|
|
|
|
UI_BUT_IMMEDIATE = (1 << 20),
|
|
|
|
UI_BUT_NO_UTF8 = (1 << 21),
|
|
|
|
|
|
|
|
UI_BUT_VEC_SIZE_LOCK = (1 << 22), /* used to flag if color hsv-circle should keep luminance */
|
|
|
|
UI_BUT_COLOR_CUBIC = (1 << 23), /* cubic saturation for the color wheel */
|
2013-11-21 16:51:29 +01:00
|
|
|
UI_BUT_LIST_ITEM = (1 << 24), /* This but is "inside" a list item (currently used to change theme colors). */
|
2014-02-08 09:34:01 +11:00
|
|
|
UI_BUT_DRAG_MULTI = (1 << 25), /* edit this button as well as the active button (not just dragging) */
|
2013-11-21 14:43:08 +01:00
|
|
|
};
|
2012-05-12 20:39:39 +00:00
|
|
|
|
|
|
|
#define UI_PANEL_WIDTH 340
|
|
|
|
#define UI_COMPACT_PANEL_WIDTH 160
|
2009-04-09 18:11:18 +00:00
|
|
|
|
2014-01-04 13:54:39 -06:00
|
|
|
#define UI_PANEL_CATEGORY_MARGIN_WIDTH (U.widget_unit * 1.0f)
|
2013-12-17 03:21:55 +11:00
|
|
|
|
2013-11-21 14:43:08 +01:00
|
|
|
/* but->drawflag - these flags should only affect how the button is drawn. */
|
2013-11-21 16:51:29 +01:00
|
|
|
/* Note: currently, these flags _are not passed_ to the widget's state() or draw() functions
|
|
|
|
* (except for the 'align' ones)!
|
|
|
|
*/
|
2013-11-21 14:43:08 +01:00
|
|
|
enum {
|
|
|
|
/* Text and icon alignment (by default, they are centered). */
|
|
|
|
UI_BUT_TEXT_LEFT = (1 << 1),
|
|
|
|
UI_BUT_ICON_LEFT = (1 << 2),
|
|
|
|
UI_BUT_TEXT_RIGHT = (1 << 3),
|
|
|
|
/* Prevent the button from showing any tooltip. */
|
|
|
|
UI_BUT_NO_TOOLTIP = (1 << 4),
|
|
|
|
/* button align flag, for drawing groups together (also used in uiBlock->flag!) */
|
|
|
|
UI_BUT_ALIGN_TOP = (1 << 14),
|
|
|
|
UI_BUT_ALIGN_LEFT = (1 << 15),
|
|
|
|
UI_BUT_ALIGN_RIGHT = (1 << 16),
|
|
|
|
UI_BUT_ALIGN_DOWN = (1 << 17),
|
|
|
|
UI_BUT_ALIGN = (UI_BUT_ALIGN_TOP | UI_BUT_ALIGN_LEFT | UI_BUT_ALIGN_RIGHT | UI_BUT_ALIGN_DOWN),
|
|
|
|
};
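For example, the align bits are meant to be OR'd together on buttons that share edges in a group; a standalone sketch with the values repeated from the enum above:

#include <stdio.h>

#define UI_BUT_ALIGN_TOP    (1 << 14)
#define UI_BUT_ALIGN_LEFT   (1 << 15)
#define UI_BUT_ALIGN_RIGHT  (1 << 16)
#define UI_BUT_ALIGN_DOWN   (1 << 17)
#define UI_BUT_ALIGN        (UI_BUT_ALIGN_TOP | UI_BUT_ALIGN_LEFT | UI_BUT_ALIGN_RIGHT | UI_BUT_ALIGN_DOWN)

int main(void)
{
	/* Middle button of a horizontal row: it has neighbours on both sides. */
	int drawflag = UI_BUT_ALIGN_LEFT | UI_BUT_ALIGN_RIGHT;

	if (drawflag & UI_BUT_ALIGN) {
		printf("part of an aligned group, draw shared edges without rounding\n");
	}
	return 0;
}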
|
2012-05-23 14:24:40 +00:00
|
|
|
|
2012-12-13 01:21:12 +00:00
|
|
|
/* scale fixed button widths by this to account for DPI */
|
Holiday coding log :)
Nice formatted version (pictures soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.66/Usability
Short list of main changes:
- Transparent region option (over main region), added code to blend in/out such panels.
- Min size window now 640 x 480
- Fixed DPI for UI - lots of cleanup and changes everywhere. Icon images still need correct sizes, layer-in-use icon needs a remake.
- Macbook retina support, use command line --no-native-pixels to disable it
- Timeline Marker label was drawing wrong
- Trackpad and magic mouse: supports zoom (hold ctrl)
- Fix for splash position: removed ghost function and made window size update after creation immediate
- Fast undo buffer save now adds UI as well. Could be checked for regular file save even...
Quit.blend and temp file saving use this now.
- Fixed filename in window on reading quit.blend or temp saves, and they now add a warning in the window title: "(Recovered)"
- New Userpref option "Keep Session" - this always saves quit.blend, and loads on start.
This allows keeping UI and data without actual saves, until you actually save.
When you load startup.blend and quit, it recognises the quit.blend as a startup (no file name in header)
- Added 3D view copy/paste buffers (selected objects). Shortcuts ctrl-c, ctrl-v (OSX, cmd-c, cmd-v).
Coded partial file saving for it. Could be used for other purposes. Todo: use OS clipboards.
- User preferences (themes, keymaps, user settings) now can be saved as a separate file.
Old option is called "Save Startup File" the new one "Save User Settings".
To visualise this difference, the 'save startup file' button has been removed from the user preferences window. That option is still available as CTRL+U and in the File menu.
- OSX: fixed bug that stopped giving mouse events outside window.
This also fixes "Continuous Grab" for OSX. (error since 2009)
2012-12-12 18:58:11 +00:00
|
|
|
|
|
|
|
#define UI_DPI_FAC ((U.pixelsize * (float)U.dpi) / 72.0f)
|
2013-02-28 16:37:19 +00:00
|
|
|
#define UI_DPI_WINDOW_FAC (((float)U.dpi) / 72.0f)
|
2011-08-05 10:45:32 +00:00
|
|
|
/* 16 to copy ICON_DEFAULT_HEIGHT */
|
2012-12-13 01:21:12 +00:00
|
|
|
#define UI_DPI_ICON_SIZE ((float)16 * UI_DPI_FAC)
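A worked example of the scaling (pure arithmetic, with stand-in values since U.pixelsize and U.dpi live in the user preferences): at 72 dpi with pixel size 1.0 the factor is 1.0 and fixed sizes are unchanged; with pixel size 2.0 (a retina-style display) the factor becomes 2.0 and the 16px icon is drawn at 32px.

#include <stdio.h>

int main(void)
{
	/* Stand-ins for U.pixelsize and U.dpi from the user preferences. */
	float pixelsize = 2.0f;
	int dpi = 72;

	float ui_dpi_fac = (pixelsize * (float)dpi) / 72.0f;   /* same formula as UI_DPI_FAC */
	float icon_size = 16.0f * ui_dpi_fac;                  /* same formula as UI_DPI_ICON_SIZE */

	printf("UI_DPI_FAC = %.2f, icon size = %.1f px\n", ui_dpi_fac, icon_size);
	return 0;
}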
|
2011-05-05 15:21:43 +00:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
/* Button types, bits stored in 1 value... and a short even!
|
2012-03-03 16:31:46 +00:00
|
|
|
* - bits 0-4: bitnr (0-31)
|
|
|
|
* - bits 5-7: pointer type
|
|
|
|
* - bit 8: for 'bit'
|
|
|
|
* - bit 9-15: button type (now 6 bits, 64 types)
|
|
|
|
* */
|
2012-09-11 23:10:23 +00:00
|
|
|
typedef enum {
|
2012-09-12 00:32:33 +00:00
|
|
|
UI_BUT_POIN_CHAR = 32,
|
|
|
|
UI_BUT_POIN_SHORT = 64,
|
|
|
|
UI_BUT_POIN_INT = 96,
|
|
|
|
UI_BUT_POIN_FLOAT = 128,
|
|
|
|
/* UI_BUT_POIN_FUNCTION = 192, */ /*UNUSED*/
|
|
|
|
UI_BUT_POIN_BIT = 256 /* OR'd with a bit index */
|
2012-09-11 23:10:23 +00:00
|
|
|
} eButPointerType;
|
2012-05-12 20:39:39 +00:00
|
|
|
|
2012-09-14 05:44:47 +00:00
|
|
|
/* requires (but->poin != NULL) */
|
2012-09-12 00:32:33 +00:00
|
|
|
#define UI_BUT_POIN_TYPES (UI_BUT_POIN_FLOAT | UI_BUT_POIN_SHORT | UI_BUT_POIN_CHAR)
|
2012-05-12 20:39:39 +00:00
|
|
|
|
2012-07-18 17:29:15 +00:00
|
|
|
/* assigned to but->type, OR'd with the flags above when passing args */
|
2012-09-10 06:05:19 +00:00
|
|
|
typedef enum {
|
|
|
|
BUT = (1 << 9),
|
|
|
|
ROW = (2 << 9),
|
|
|
|
TOG = (3 << 9),
|
|
|
|
NUM = (5 << 9),
|
|
|
|
TEX = (6 << 9),
|
|
|
|
TOGN = (9 << 9),
|
|
|
|
LABEL = (10 << 9),
|
Fix for [#35224] Transform Orientation - order inconsistency
The fix turned out to remove as much "manual UI" from the 3D view header as possible. The mode selector and all transform manipulator/orientation stuff are now RNA-based UI (leaving basically only the edit mesh select modes with custom handlers, as they have some quite specific features).
To achieve this, four main modifications were done:
* Enum-operator-generated menus are now MENU (i.e. dropdown lists) in headers too.
* All bit-flag enums expanded in ROW buttons now have handling consistent with e.g. layers, or what we already have for transform manipulators, i.e. clicking selects only one element, shift-click selects multiple ones.
* Consequently, the three RNA boolean manipulator flags are merged into a single bit-flag enum (yes, this is also an API change, though I doubt many scripts use it).
* The width of enum-based dropdown lists is now computed from the longest item name in the enum, no longer from a dummy placeholder string (when no label/name is given).
All this allows removing some code from the 3DView/transform areas that was mostly duplicating the RNA/operator one.
Also done a few optimizations here and there (among others, do not pass &numitems to RNA_property_enum_items() when you do not need it, which saves at least an iteration over enum items to count them).
Many thanks to Brecht for the reviews!
2013-05-12 13:16:11 +00:00
|
|
|
MENU = (11 << 9), /* Dropdown list, actually! */
|
2012-09-10 06:05:19 +00:00
|
|
|
ICONTOG = (13 << 9),
|
|
|
|
NUMSLI = (14 << 9),
|
2012-09-11 23:10:23 +00:00
|
|
|
COLOR = (15 << 9),
|
2012-09-10 06:05:19 +00:00
|
|
|
SCROLL = (18 << 9),
|
|
|
|
BLOCK = (19 << 9),
|
|
|
|
BUTM = (20 << 9),
|
|
|
|
SEPR = (21 << 9),
|
|
|
|
LINK = (22 << 9),
|
|
|
|
INLINK = (23 << 9),
|
|
|
|
KEYEVT = (24 << 9),
|
|
|
|
HSVCUBE = (26 << 9),
|
2013-05-12 13:16:11 +00:00
|
|
|
PULLDOWN = (27 << 9), /* Menu, actually! */
|
2012-09-10 06:05:19 +00:00
|
|
|
ROUNDBOX = (28 << 9),
|
|
|
|
BUT_COLORBAND = (30 << 9),
|
|
|
|
BUT_NORMAL = (31 << 9),
|
|
|
|
BUT_CURVE = (32 << 9),
|
|
|
|
ICONTOGN = (34 << 9),
|
2013-07-09 23:40:53 +00:00
|
|
|
LISTBOX = (35 << 9),
|
|
|
|
LISTROW = (36 << 9),
|
2012-09-10 06:05:19 +00:00
|
|
|
TOGBUT = (37 << 9),
|
|
|
|
OPTION = (38 << 9),
|
|
|
|
OPTIONN = (39 << 9),
|
|
|
|
TRACKPREVIEW = (40 << 9),
|
|
|
|
/* buttons with value >= SEARCH_MENU don't get undo pushes */
|
|
|
|
SEARCH_MENU = (41 << 9),
|
|
|
|
BUT_EXTRA = (42 << 9),
|
|
|
|
HSVCIRCLE = (43 << 9),
|
|
|
|
HOTKEYEVT = (46 << 9),
|
|
|
|
BUT_IMAGE = (47 << 9),
|
|
|
|
HISTOGRAM = (48 << 9),
|
|
|
|
WAVEFORM = (49 << 9),
|
|
|
|
VECTORSCOPE = (50 << 9),
|
2013-01-09 15:58:34 +00:00
|
|
|
PROGRESSBAR = (51 << 9),
|
2013-03-18 16:34:57 +00:00
|
|
|
SEARCH_MENU_UNLINK = (52 << 9),
|
2013-06-26 07:28:55 +00:00
|
|
|
NODESOCKET = (53 << 9),
|
2014-01-17 00:23:00 +01:00
|
|
|
SEPRLINE = (54 << 9),
|
2012-09-10 06:05:19 +00:00
|
|
|
} eButType;
|
2012-05-12 20:39:39 +00:00
|
|
|
|
|
|
|
#define BUTTYPE (63 << 9)
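Putting the bit layout described above together, a standalone sketch that packs one button value and pulls the pieces back out (constants repeated from the comment and enums above):

#include <stdio.h>

#define UI_BUT_POIN_CHAR  32         /* bits 5-7: pointer type */
#define UI_BUT_POIN_BIT   256        /* bit 8: the button toggles a single bit */
#define TOG               (3 << 9)   /* bits 9 and up: the button type */
#define BUTTYPE           (63 << 9)

int main(void)
{
	/* A TOG button operating on bit 3 of a char-sized value. */
	int packed = TOG | UI_BUT_POIN_BIT | UI_BUT_POIN_CHAR | 3;

	int bitnr  = packed & 31;                        /* bits 0-4 */
	int is_bit = (packed & UI_BUT_POIN_BIT) != 0;    /* bit 8 */
	int type   = packed & BUTTYPE;                   /* button type bits */

	printf("type == TOG: %d, bit button: %d, bit number: %d\n",
	       type == TOG, is_bit, bitnr);
	return 0;
}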
|
2008-11-11 18:31:32 +00:00
|
|
|
|
2010-01-21 00:00:45 +00:00
|
|
|
/* gradient types, for color picker HSVCUBE etc */
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_GRAD_SV 0
|
|
|
|
#define UI_GRAD_HV 1
|
|
|
|
#define UI_GRAD_HS 2
|
|
|
|
#define UI_GRAD_H 3
|
|
|
|
#define UI_GRAD_S 4
|
|
|
|
#define UI_GRAD_V 5
|
2010-01-21 00:00:45 +00:00
|
|
|
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_GRAD_V_ALT 9
|
2014-03-12 18:58:50 +02:00
|
|
|
#define UI_GRAD_L_ALT 10
|
2010-01-21 00:00:45 +00:00
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* Drawing
|
|
|
|
*
|
|
|
|
* Functions to draw various shapes, taking theme settings into account.
|
|
|
|
* Used for code that draws its own UI style elements. */
|
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now takes a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
  opening a menu, ...), it will add a UI handler at the window level which
  will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and remove callback, and like operator modal handlers
will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME, WINTHAW, but those don't seem to
arrive..
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
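A small usage sketch of the region-level registration described above; the init callback shape and the handlers member are assumptions based on the surrounding description:

/* Sketch: run when a region that shows buttons is initialized. */
static void my_region_init(wmWindowManager *wm, ARegion *ar)
{
	/* Attach the region-level UI event handlers (this replaces the old UI keymap). */
	UI_add_region_handlers(&ar->handlers);
}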
2008-12-10 04:36:33 +00:00
void uiRoundBox(float minx, float miny, float maxx, float maxy, float rad);
void uiSetRoundBox(int type);
int uiGetRoundBox(void);
void uiRoundRect(float minx, float miny, float maxx, float maxy, float rad);
void uiDrawBoxShadow(unsigned char alpha, float minx, float miny, float maxx, float maxy);
void uiDrawBox(int mode, float minx, float miny, float maxx, float maxy, float rad);
void uiDrawBoxShade(int mode, float minx, float miny, float maxx, float maxy, float rad, float shadetop, float shadedown);
void uiDrawBoxVerticalShade(int mode, float minx, float miny, float maxx, float maxy, float rad, float shadeLeft, float shadeRight);
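For orientation, a minimal sketch of how these drawing helpers fit together inside a draw callback. The GL_POLYGON/GL_LINE_LOOP modes come from the OpenGL headers, and the corner-flag value 15 (all four corners) is an assumption about uiSetRoundBox, not something declared here.

/* Sketch: draw a shaded, fully rounded box with an outline on top. */
static void example_draw_rounded_box(void)
{
	uiSetRoundBox(15); /* round all four corners (flag value assumed) */
	uiDrawBoxShade(GL_POLYGON, 10.0f, 10.0f, 110.0f, 40.0f, 5.0f, 0.15f, -0.15f);
	uiDrawBox(GL_LINE_LOOP, 10.0f, 10.0f, 110.0f, 40.0f, 5.0f); /* outline */
}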
/* State for scroll drawing. */
#define UI_SCROLL_PRESSED    (1 << 0)
#define UI_SCROLL_ARROWS     (1 << 1)
#define UI_SCROLL_NO_OUTLINE (1 << 2)

void uiWidgetScrollDraw(struct uiWidgetColors *wcol, const struct rcti *rect, const struct rcti *slider, int state);
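As a rough sketch of how the flags and rectangles combine (the rcti field order is an assumption, and wcol would normally come from the active theme, which is not shown here):

/* Sketch: state flags and rectangles for uiWidgetScrollDraw. */
static void example_draw_scrollbar(struct uiWidgetColors *wcol)
{
	struct rcti track  = {10, 26, 10, 200};  /* xmin, xmax, ymin, ymax (field order assumed) */
	struct rcti slider = {12, 24, 80, 140};  /* the moving part, inside the track */

	uiWidgetScrollDraw(wcol, &track, &slider, UI_SCROLL_PRESSED | UI_SCROLL_ARROWS);
}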
/* Callbacks
 *
 * uiBlockSetHandleFunc/ButmFunc are for handling events through a callback.
 * HandleFunc gets the retval passed on, and ButmFunc gets a2. The latter is
 * mostly for compatibility with older code.
 *
 * uiButSetCompleteFunc is for tab completion.
 *
 * uiButSearchFunc is for name buttons, showing a popup with matches.
 *
 * uiBlockSetFunc and uiButSetFunc are callbacks run when a button is used,
 * in case events, operators or RNA are not sufficient to handle the button.
 *
 * uiButSetNFunc will free the argument with MEM_freeN. */
typedef struct uiSearchItems uiSearchItems;
typedef void (*uiButHandleFunc)(struct bContext *C, void *arg1, void *arg2);
typedef void (*uiButHandleRenameFunc)(struct bContext *C, void *arg, char *origstr);
typedef void (*uiButHandleNFunc)(struct bContext *C, void *argN, void *arg2);
typedef int (*uiButCompleteFunc)(struct bContext *C, char *str, void *arg);
typedef void (*uiButSearchFunc)(const struct bContext *C, void *arg, const char *str, uiSearchItems *items);
typedef void (*uiBlockHandleFunc)(struct bContext *C, void *arg, int event);
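A small sketch of a callback matching the uiButHandleFunc shape above (the function name and the meaning of arg1/arg2 are hypothetical; how it gets attached, e.g. via the uiButSetFunc mentioned in the comment, is declared elsewhere):

/* Hypothetical uiButHandleFunc: runs when the button it was set on is used. */
static void example_but_func(struct bContext *C, void *arg1, void *arg2)
{
	struct ID *id = arg1;   /* whatever was passed in when the callback was set */
	(void)C;
	(void)arg2;
	if (id) {
		/* react to the button press here, e.g. tag the data for an update */
	}
}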
/* Menu Callbacks */

typedef void (*uiMenuCreateFunc)(struct bContext *C, struct uiLayout *layout, void *arg1);
typedef void (*uiMenuHandleFunc)(struct bContext *C, void *arg, int event);
/* Popup Menus
 *
 * Functions used to create popup menus. For more extended menus the
 * uiPupMenuBegin/End functions can be used to define own items with
 * the uiItem functions in between. If it is a simple confirmation menu
 * or similar, popups can be created with a single function call. */
typedef struct uiPopupMenu uiPopupMenu;

struct uiPopupMenu *uiPupMenuBegin(struct bContext *C, const char *title, int icon) ATTR_NONNULL();
void uiPupMenuEnd(struct bContext *C, struct uiPopupMenu *head);
struct uiLayout *uiPupMenuLayout(uiPopupMenu *head);
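As a sketch of the Begin/Layout/End pattern described in the comment above (the uiItemO/uiItemS layout calls and the operator names are assumptions, since the layout API is declared elsewhere):

/* Sketch: build a small popup menu and let the handler take over. */
static void example_show_popup(struct bContext *C)
{
	uiPopupMenu *pup = uiPupMenuBegin(C, "Example", 0); /* 0 = no icon (assumed) */
	struct uiLayout *layout = uiPupMenuLayout(pup);

	/* the "uiItem functions" referred to above; exact signatures assumed */
	uiItemO(layout, NULL, 0, "OBJECT_OT_duplicate_move");
	uiItemS(layout);
	uiItemO(layout, NULL, 0, "OBJECT_OT_delete");

	uiPupMenuEnd(C, pup);
}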
void uiPupMenuReports(struct bContext *C, struct ReportList *reports) ATTR_NONNULL();
bool uiPupMenuInvoke(struct bContext *C, const char *idname, struct ReportList *reports) ATTR_NONNULL(1, 2);
/* Popup Blocks
 *
 * Functions used to create popup blocks. These are like popup menus
 * but allow using all button types and creating your own layout. */
typedef uiBlock *(*uiBlockCreateFunc)(struct bContext *C, struct ARegion *ar, void *arg1);
typedef void (*uiBlockCancelFunc)(struct bContext *C, void *arg1);

void uiPupBlock(struct bContext *C, uiBlockCreateFunc func, void *arg);
void uiPupBlockO(struct bContext *C, uiBlockCreateFunc func, void *arg, const char *opname, int opcontext);
void uiPupBlockEx(struct bContext *C, uiBlockCreateFunc func, uiBlockHandleFunc popup_func, uiBlockCancelFunc cancel_func, void *arg);
/* void uiPupBlockOperator(struct bContext *C, uiBlockCreateFunc func, struct wmOperator *op, int opcontext); */ /* UNUSED */

void uiPupBlockClose(struct bContext *C, uiBlock *block);
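A minimal sketch of the create-function pattern these take (button definitions are elided since the uiDefBut family is declared elsewhere; the block name, the UI_EMBOSS draw type and the uiPopupBoundsBlock arguments are assumptions):

/* Sketch: block create function, shown from an operator via uiPupBlock(). */
static uiBlock *example_popup_create(struct bContext *C, struct ARegion *ar, void *arg1)
{
	uiBlock *block = uiBeginBlock(C, ar, "example_popup", UI_EMBOSS); /* dt value assumed */
	(void)arg1;

	/* ... define the popup's buttons here ... */

	uiPopupBoundsBlock(block, 4, 0, 0); /* fit bounds afterwards, keep the popup on screen */
	uiEndBlock(C, block);
	return block;
}

/* somewhere in an operator invoke: uiPupBlock(C, example_popup_create, op); */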
/* Blocks
 *
 * Functions for creating, drawing and freeing blocks. A Block is a
 * container of buttons and used for various purposes.
 *
 * Begin/Define Buttons/End/Draw is the typical order in which these
 * functions should be called, though for popup blocks Draw is left out.
 * Freeing blocks is done by the screen/ module automatically. */
2.5
More cleanup!
- removed old UI font completely, including from uiBeginBlock
- emboss hints for uiBlock only have three types now;
Regular, Pulldown, or "Nothing" (only icon/text)
- removed old font path from Userdef
- removed all old button theme hinting
- removed old "auto block" to merge buttons in groups
(was only in use for radiosity buttons)
And went over all warnings. One hooray for make giving clean output :)
Well, we need uniform definitions for warnings, so people at least fix
them... here are the real bad bugs I found:
- in mesh code, a call to editmesh mixed *em and *me
- in armature, ED_util.h was not included, so no warnings for wrong call
to ED_undo_push()
- The extern Py API .h was not included in bpy_interface.c, showing
several calls using different args.
Further just added the missing includes, and removed unused vars.
2009-04-14 15:59:52 +00:00
uiBlock *uiBeginBlock(const struct bContext *C, struct ARegion *region, const char *name, short dt);
void uiEndBlock(const struct bContext *C, uiBlock *block);
void uiDrawBlock(const struct bContext *C, struct uiBlock *block);
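And a sketch of the Begin/Define Buttons/End/Draw order from the Blocks comment above, as it would appear in a region draw function (block name and emboss value are placeholders; button definitions elided):

/* Sketch: typical block lifecycle inside a region's draw callback. */
static void example_region_draw(const struct bContext *C, struct ARegion *ar)
{
	uiBlock *block = uiBeginBlock(C, ar, "example_block", UI_EMBOSS); /* dt value assumed */

	/* ... uiDefBut()/layout calls go here ... */

	uiEndBlock(C, block);
	uiDrawBlock(C, block);
	/* no explicit free: the screen module frees blocks, see the Blocks comment */
}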
uiBlock *uiGetBlock(const char *name, struct ARegion *ar);
void uiBlockSetEmboss(uiBlock *block, char dt);
void uiFreeBlock(const struct bContext *C, uiBlock *block);
void uiFreeBlocks(const struct bContext *C, struct ListBase *lb);
void uiFreeInactiveBlocks(const struct bContext *C, struct ListBase *lb);
void uiFreeActiveButtons(const struct bContext *C, struct bScreen *screen);
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates, but still get nested
correctly inside other layouts.
* API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now when creating things
like rows or columns it always returns a layout which you have to add
the items in. All py scripts were updated to follow this.
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending if they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
2009-05-15 11:19:59 +00:00
void uiBlockSetRegion(uiBlock *block, struct ARegion *region);

void uiBlockSetButLock(uiBlock *block, bool val, const char *lockstr);
void uiBlockClearButLock(uiBlock *block);
/* Automatic aligning, horizontal or vertical. */
void uiBlockBeginAlign(uiBlock *block);
void uiBlockEndAlign(uiBlock *block);
/* Block bounds/position calculation. */
typedef enum {
	UI_BLOCK_BOUNDS_NONE = 0,
	UI_BLOCK_BOUNDS = 1,
	UI_BLOCK_BOUNDS_TEXT,
	UI_BLOCK_BOUNDS_POPUP_MOUSE,
	UI_BLOCK_BOUNDS_POPUP_MENU,
	UI_BLOCK_BOUNDS_POPUP_CENTER
} eBlockBoundsCalc;
void uiBoundsBlock(struct uiBlock *block, int addval);
void uiTextBoundsBlock(uiBlock *block, int addval);
void uiPopupBoundsBlock(uiBlock *block, int addval, int mx, int my);
void uiMenuPopupBoundsBlock(uiBlock *block, int addval, int mx, int my);
void uiCenteredBoundsBlock(uiBlock *block, int addval);
void uiExplicitBoundsBlock(uiBlock *block, int minx, int miny, int maxx, int maxy);
int uiBlocksGetYMin(struct ListBase *lb);
void uiBlockSetDirection(uiBlock *block, char direction);
void uiBlockFlipOrder(uiBlock *block);
void uiBlockSetFlag(uiBlock *block, int flag);
void uiBlockClearFlag(uiBlock *block, int flag);
int uiButGetRetVal(uiBut *but);
void uiButSetDragID(uiBut *but, struct ID *id);
void uiButSetDragRNA(uiBut *but, struct PointerRNA *ptr);
void uiButSetDragPath(uiBut *but, const char *path);
void uiButSetDragName(uiBut *but, const char *name);
void uiButSetDragValue(uiBut *but);
void uiButSetDragImage(uiBut *but, const char *path, int icon, struct ImBuf *ima, float scale);
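For example, wiring a button up as a drag source could look like this sketch (the helper name is hypothetical; but and id are assumed to come from the surrounding button-definition code):

/* Sketch: make an existing button drag its datablock when click-dragged. */
static void example_make_but_draggable(uiBut *but, struct ID *id)
{
	uiButSetDragID(but, id);                 /* drag the ID itself (shows its icon) */
	/* or, for file-based data, drag a path instead: */
	/* uiButSetDragPath(but, "/tmp/example.png"); */
}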
Drag and drop 2.5 integration! Finally, slashdot regulars can use
Blender too now! :)
** Drag works as follows:
- drag-able items are defined by the standard interface ui toolkit
- each button can get this feature, via uiButSetDragXXX(but, ...).
There are calls to define drag-able images, ID blocks, RNA paths,
file paths, and so on. By default you drag an icon, exceptionally
an ImBuf
- Drag items are registered centrally in the WM; multiple simultaneous drag
items are allowed too, but this is not implemented yet
** Drop works as follows:
- On mouse release, and if drag items exist in the WM, it converts
the mouse event to an EVT_DROP type. This event then gets the full
drag info as customdata
- drop regions are defined with WM_dropbox_add(), similar to keymaps
you can make a "drop map" this way, which become 'drop map handlers'
in the queues.
- next to that the UI kit handles some common button types (like
accepting ID or names) to catch a drop event too.
- Every "drop box" has two callbacks:
- poll() = check if the event drag data is relevant for this box
- copy() = fill in custom properties in the dropbox to initialize
an operator
- The dropbox handler then calls its standard Operator with its
dropbox properties.
** Currently implemented
Drag items:
- ID icons in browse buttons
- ID icons in context menu of properties region
- ID icons in outliner and rna viewer
- FileBrowser icons
- FileBrowser preview images
Drag-able icons are subtly visualized by making them a bit brighter
on mouse-over. In case the icon is a button or UI element too (most
cases), the drag-able feature will make the item react to
mouse-release instead of mouse-press.
Drop options:
- UI buttons: ID and text buttons (paste name)
- View3d: Object ID drop copies object
- View3d: Material ID drop assigns to object under cursor
- View3d: Image ID drop assigns to object UV texture under cursor
- Sequencer: Path drop will add either Image or Movie strip
- Image window: Path drop will open image
** Drag and drop Notes:
- Dropping into another Blender window (from same application) works
too. I've added code that passes on mousemoves and clicks to other
windows, without activating them though. This does make using multi-window
Blender a bit friendlier.
- Dropping a file path to an image, is not the same as dropping an
Image ID... keep this in mind. Sequencer for example wants paths to
be dropped, textures in 3d window wants an Image ID.
- Although drop boxes could be defined via Python, I suggest they're
part of the UI and editor design (= how we want an editor to work), and
not offered as configurable by default like keymaps.
- At the moment only one item can be dragged at a time. This is for
several reasons.... For one, Blender doesn't have a well defined
uniform way to define "what is selected" (files, outliner items, etc).
Secondly, there are potential conflicts about what to do when you drop
mixed drag sets on spots. All undefined stuff... nice for later.
- Example to bypass the above: a collection of images that form a strip,
should be represented in filewindow as a single sequence anyway.
This then will fit well and gets handled neatly by design.
- Another option to check is to allow multiple options per drop... it
could show the operator as a sort of menu, allowing arrow or scrollwheel
to choose. For the time being I'd prefer to try to design a singular drop
though, and just offer one drop action per data type on given spots.
- What does work already, but a tad slow, is to use a function that
detects an object (type) under cursor, so a drag item's option can be
further refined (like drop object on object = parent). (disabled)
** More notes
- Added saving for Region layouts (like split points for toolbar)
- Label buttons now handle mouse over
- File list: added full path entry for drop feature.
- Filesel bugfix: wm_operator_exec() got called there and fully handled,
while the WM event code tried the same. Added a new OPERATOR_HANDLED flag for this.
Maybe python needs it too?
- Cocoa: added window move event, so multi-win setups work OK (didn't save).
- Interface_handlers.c: removed win->active
- Severe area copy bug: area handlers were not set to NULL
- Filesel bugfix: next/prev folder list was not copied on area copies
** Leftover todos
- Cocoa windows seem to hang on cases still... needs check
- Cocoa 'draw overlap' swap doesn't work
- Cocoa window loses focus permanently on using Spotlight
(for these reasons, makefile building has Carbon as default atm)
- ListView templates in UI cannot become dragged yet, needs review...
it consists of two overlapping UI elements, preventing handling icon clicks.
- There's already Ghost library code to handle dropping from the OS
into a Blender window. I've noticed this code is unfinished for Macs, but
seems to be complete for Windows. Needs testing... currently, an external
drop event will print in the console when successfully delivered to Blender's WM.
2010-01-26 18:18:21 +00:00
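To make the drag and drop flow above concrete: on the drag side a button is turned into a drag source with one of the uiButSetDrag* calls declared above (for example uiButSetDragID(but, &ob->id)), and on the drop side an editor registers a dropbox with poll() and copy() callbacks. The sketch below follows the 2.5-era window manager code from memory; the WM_dropbox_add()/WM_dropboxmap_find() signatures, the wmDrag fields, the includes and the operator name are assumptions, not exact copies.

#include "DNA_ID.h"
#include "RNA_access.h"
#include "WM_api.h"
#include "WM_types.h"

/* poll(): only accept drags that carry an ID datablock. */
static int example_id_drop_poll(bContext *C, wmDrag *drag, const wmEvent *event)
{
	return (drag->type == WM_DRAG_ID);
}

/* copy(): fill the dropbox operator properties from the drag data,
 * so the handler can call its operator with them. */
static void example_id_drop_copy(wmDrag *drag, wmDropBox *drop)
{
	ID *id = (ID *)drag->poin;
	RNA_string_set(drop->ptr, "name", id->name + 2); /* skip the two-letter ID code */
}

/* Registration: build the editor's "drop map", similar to registering a keymap. */
static void example_dropboxes(void)
{
	ListBase *lb = WM_dropboxmap_find("View3D", SPACE_VIEW3D, 0);
	WM_dropbox_add(lb, "OBJECT_OT_add_named", example_id_drop_poll, example_id_drop_copy);
}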
|
|
|
|
2012-05-12 20:39:39 +00:00
|
|
|
int UI_but_active_drop_name(struct bContext *C);
|
2010-01-26 18:18:21 +00:00
|
|
|
|
2012-05-12 20:39:39 +00:00
|
|
|
void uiButSetFlag(uiBut *but, int flag);
|
|
|
|
void uiButClearFlag(uiBut *but, int flag);
|
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now takes a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap.
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ..), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has event handling and remove callbacks, and like operator modal handlers
it will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem to
arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
2008-12-10 04:36:33 +00:00
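In practice the UI_add_region_handlers() call described above is done when an editor initializes a region, before adding its own keymap handlers. A minimal sketch; the keymap name, the includes and the surrounding region-init shape are illustrative rather than taken from a specific editor:

#include "UI_interface.h"
#include "WM_api.h"

/* Typical region init callback: region level UI handlers first
 * (replacing the old UI keymap), then editor keymaps. */
static void example_region_init(wmWindowManager *wm, ARegion *ar)
{
	UI_add_region_handlers(&ar->handlers);

	/* Editor-specific keymap handler; the keymap name is just an example. */
	wmKeyMap *keymap = WM_keymap_find(wm->defaultconf, "View2D", 0, 0);
	WM_event_add_keymap_handler(&ar->handlers, keymap);
}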
|
|
|
|
2012-05-23 14:24:40 +00:00
|
|
|
void uiButSetDrawFlag(uiBut *but, int flag);
|
|
|
|
void uiButClearDrawFlag(uiBut *but, int flag);
|
|
|
|
|
2014-02-12 00:08:54 +11:00
|
|
|
void uiButSetMenuFromPulldown(uiBut *but);
|
|
|
|
|
2009-07-25 13:40:59 +00:00
|
|
|
/* special button case, only draw it when used actively, for outliner etc */
|
2013-04-04 02:05:11 +00:00
|
|
|
bool uiButActiveOnly(const struct bContext *C, struct ARegion *ar, uiBlock *block, uiBut *but);
|
2009-07-25 13:40:59 +00:00
|
|
|
|
2013-02-22 05:56:20 +00:00
|
|
|
void uiButExecute(const struct bContext *C, uiBut *but);
|
|
|
|
|
2009-07-25 13:40:59 +00:00
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* Buttons
|
|
|
|
*
|
|
|
|
* Functions to define various types of buttons in a block. Postfixes:
|
|
|
|
* - F: float
|
|
|
|
* - I: int
|
|
|
|
* - S: short
|
|
|
|
* - C: char
|
|
|
|
* - R: RNA
|
|
|
|
* - O: operator */
|
2008-12-10 04:36:33 +00:00
|
|
|
|
2008-11-11 18:31:32 +00:00
|
|
|
uiBut *uiDefBut(uiBlock *block,
|
2012-05-12 20:39:39 +00:00
|
|
|
int type, int retval, const char *str,
|
|
|
|
int x1, int y1,
|
|
|
|
short x2, short y2,
|
|
|
|
void *poin,
|
|
|
|
float min, float max,
|
|
|
|
float a1, float a2, const char *tip);
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefButF(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButBitF(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButI(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButBitI(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButS(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButBitS(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButC(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButBitC(uiBlock *block, int type, int bit, int retval, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButR(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButR_prop(uiBlock *block, int type, int retval, const char *str, int x, int y, short width, short height, struct PointerRNA *ptr, struct PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefButO(uiBlock *block, int type, const char *opname, int opcontext, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefButO_ptr(uiBlock *block, int type, struct wmOperatorType *ot, int opcontext, const char *str, int x, int y, short width, short height, const char *tip);
|
2008-11-11 18:31:32 +00:00
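A typical use of the uiDefBut* family above binds a button directly to a variable; min/max clamp the value and, for number buttons, a1/a2 act roughly as step and precision. A hedged sketch: the ExampleSettings struct and the flag define are hypothetical, only the uiDefBut* calls and the NUM/TOG button types come from this header.

#include "UI_interface.h"

typedef struct ExampleSettings { float scale; int flag; } ExampleSettings; /* hypothetical */
#define EXAMPLE_FLAG_ENABLED (1 << 0)                                       /* hypothetical */

static void example_draw_buttons(uiBlock *block, ExampleSettings *settings, int x, int y)
{
	/* Float number button editing settings->scale in the range 0..10. */
	uiDefButF(block, NUM, 0, "Scale:", x, y, 120, 20,
	          &settings->scale, 0.0f, 10.0f, 10, 3, "Scale factor");

	/* Bit toggle flipping one flag bit inside settings->flag. */
	uiDefButBitI(block, TOG, EXAMPLE_FLAG_ENABLED, 0, "Enabled", x + 125, y, 80, 20,
	             &settings->flag, 0.0f, 0.0f, 0, 0, "Toggle the enabled bit");
}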
|
|
|
|
|
|
|
uiBut *uiDefIconBut(uiBlock *block,
|
2012-05-12 20:39:39 +00:00
|
|
|
int type, int retval, int icon,
|
|
|
|
int x1, int y1,
|
|
|
|
short x2, short y2,
|
|
|
|
void *poin,
|
|
|
|
float min, float max,
|
|
|
|
float a1, float a2, const char *tip);
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefIconButF(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButBitF(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButI(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButBitI(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButS(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButBitS(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButC(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButBitC(uiBlock *block, int type, int bit, int retval, int icon, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButR(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButR_prop(uiBlock *block, int type, int retval, int icon, int x, int y, short width, short height, struct PointerRNA *ptr, PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconButO(uiBlock *block, int type, const char *opname, int opcontext, int icon, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefIconButO_ptr(uiBlock *block, int type, struct wmOperatorType *ot, int opcontext, int icon, int x, int y, short width, short height, const char *tip);
|
2008-12-16 07:55:43 +00:00
|
|
|
|
|
|
|
uiBut *uiDefIconTextBut(uiBlock *block,
|
2012-05-12 20:39:39 +00:00
|
|
|
int type, int retval, int icon, const char *str,
|
|
|
|
int x1, int y1,
|
|
|
|
short x2, short y2,
|
|
|
|
void *poin,
|
|
|
|
float min, float max,
|
|
|
|
float a1, float a2, const char *tip);
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefIconTextButF(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButBitF(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, float *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButI(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButBitI(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, int *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButS(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButBitS(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, short *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButC(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButBitC(uiBlock *block, int type, int bit, int retval, int icon, const char *str, int x, int y, short width, short height, char *poin, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButR(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, struct PointerRNA *ptr, const char *propname, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButR_prop(uiBlock *block, int type, int retval, int icon, const char *str, int x, int y, short width, short height, struct PointerRNA *ptr, struct PropertyRNA *prop, int index, float min, float max, float a1, float a2, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButO(uiBlock *block, int type, const char *opname, int opcontext, int icon, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefIconTextButO_ptr(uiBlock *block, int type, struct wmOperatorType *ot, int opcontext, int icon, const char *str, int x, int y, short width, short height, const char *tip);
|
2008-11-11 18:31:32 +00:00
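The ...ButO variants above create buttons that invoke an operator instead of editing a pointer. A short sketch using the uiDefIconTextButO declaration above; the operator name is just an example, while BUT, WM_OP_INVOKE_DEFAULT and ICON_NONE are assumed to be the usual button type, operator context and icon constants:

#include "UI_interface.h"
#include "UI_resources.h"
#include "WM_types.h"

static void example_operator_button(uiBlock *block, int x, int y)
{
	/* Icon + text button that invokes the named operator when pressed. */
	uiDefIconTextButO(block, BUT, "SCREEN_OT_repeat_last", WM_OP_INVOKE_DEFAULT,
	                  ICON_NONE, "Repeat Last", x, y, 120, 20, "Repeat the last action");
}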
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* for passing inputs to ButO buttons */
|
|
|
|
struct PointerRNA *uiButGetOperatorPtrRNA(uiBut *but);
|
|
|
|
|
2010-12-10 04:10:21 +00:00
|
|
|
void uiButSetUnitType(uiBut *but, const int unit_type);
|
2013-12-18 17:47:38 +11:00
|
|
|
int uiButGetUnitType(const uiBut *but);
|
2010-12-10 04:10:21 +00:00
|
|
|
|
UI translation from inside Blender UI: first part.
This commit reshapes the runtime button info getter a bit, adding a new uiButGetStrInfo() which accepts a variable number of uiStringInfo parameters and tries to fill them with the requested strings for the given button (label, tip, context, RNA identifier, keymap, etc.). Currently used mostly by the existing ui_tooltip_create(), and the new UI_OT_edittranslation_init operator.
It also adds a few getters (to get the RNA i18n context, and the current language iso code).
Finally, it adds two C operators needed for the py ui_translation addon:
*UI_OT_edittranslation_init, which gathers the requested data and launches the py operator.
*UI_OT_reloadtranslation, which forces a full reload of the whole UI translation (including rechecking the directory containing mo files).
For the first operator to work, it also adds a new user preferences path: i18n_branches_directory, pointing to the /branch part of a bf-translation checkout.
2012-07-09 14:25:35 +00:00
|
|
|
enum {
|
|
|
|
BUT_GET_RNAPROP_IDENTIFIER = 1,
|
|
|
|
BUT_GET_RNASTRUCT_IDENTIFIER,
|
|
|
|
BUT_GET_RNAENUM_IDENTIFIER,
|
|
|
|
BUT_GET_LABEL,
|
|
|
|
BUT_GET_RNA_LABEL,
|
|
|
|
BUT_GET_RNAENUM_LABEL,
|
|
|
|
BUT_GET_RNA_LABEL_CONTEXT, /* Context specified in CTX_XXX_ macros are just unreachable! */
|
|
|
|
BUT_GET_TIP,
|
|
|
|
BUT_GET_RNA_TIP,
|
|
|
|
BUT_GET_RNAENUM_TIP,
|
2013-07-07 12:53:15 +00:00
|
|
|
BUT_GET_OP_KEYMAP,
|
|
|
|
BUT_GET_PROP_KEYMAP
|
2012-07-09 14:25:35 +00:00
|
|
|
};
|
|
|
|
|
|
|
|
typedef struct uiStringInfo {
|
|
|
|
int type;
|
|
|
|
char *strinfo;
|
|
|
|
} uiStringInfo;
|
|
|
|
|
|
|
|
/* Note: Expects pointers to uiStringInfo structs as parameters.
|
|
|
|
* Will fill them with translated strings, when possible.
|
|
|
|
* Strings in uiStringInfo must be MEM_freeN'ed by caller. */
|
2013-09-01 15:01:15 +00:00
|
|
|
void uiButGetStrInfo(struct bContext *C, uiBut *but, ...) ATTR_SENTINEL(0);
|
2012-07-09 14:25:35 +00:00
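Putting the uiStringInfo pieces above together: the caller initializes one struct per requested string, terminates the variable argument list with NULL (as ATTR_SENTINEL(0) hints), and frees each returned string. A small sketch matching the note above; MEM_freeN() from the guardedalloc module is assumed for the freeing:

#include <stdio.h>

#include "MEM_guardedalloc.h"
#include "UI_interface.h"

static void example_print_but_strings(struct bContext *C, uiBut *but)
{
	uiStringInfo label = {BUT_GET_LABEL, NULL};
	uiStringInfo tip = {BUT_GET_TIP, NULL};

	/* Fills strinfo with translated strings when possible. */
	uiButGetStrInfo(C, but, &label, &tip, NULL);

	if (label.strinfo) {
		printf("label: %s\n", label.strinfo);
		MEM_freeN(label.strinfo);
	}
	if (tip.strinfo) {
		printf("tip: %s\n", tip.strinfo);
		MEM_freeN(tip.strinfo);
	}
}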
|
|
|
|
|
|
|
/* Edit i18n stuff. */
|
|
|
|
/* Name of the main py op from i18n addon. */
|
|
|
|
#define EDTSRC_I18N_OP_NAME "UI_OT_edittranslation"
|
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* Special Buttons
|
|
|
|
*
|
2012-03-01 12:20:18 +00:00
|
|
|
* Buttons with a more specific purpose:
|
2009-01-30 12:18:08 +00:00
|
|
|
* - MenuBut: buttons that popup a menu (in headers usually).
|
|
|
|
* - PulldownBut: like MenuBut, but creating a uiBlock (for compatibility).
|
|
|
|
* - BlockBut: buttons that popup a block with more buttons.
|
|
|
|
* - KeyevtBut: buttons that can be used to turn key events into values.
|
2009-02-04 11:52:16 +00:00
|
|
|
* - PickerButtons: buttons like the color picker (for code sharing).
|
|
|
|
* - AutoButR: RNA property button with type automatically defined. */
|
2009-01-30 12:18:08 +00:00
|
|
|
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_ID_RENAME (1 << 0)
|
|
|
|
#define UI_ID_BROWSE (1 << 1)
|
|
|
|
#define UI_ID_ADD_NEW (1 << 2)
|
|
|
|
#define UI_ID_OPEN (1 << 3)
|
|
|
|
#define UI_ID_ALONE (1 << 4)
|
|
|
|
#define UI_ID_DELETE (1 << 5)
|
|
|
|
#define UI_ID_LOCAL (1 << 6)
|
|
|
|
#define UI_ID_AUTO_NAME (1 << 7)
|
|
|
|
#define UI_ID_FAKE_USER (1 << 8)
|
|
|
|
#define UI_ID_PIN (1 << 9)
|
|
|
|
#define UI_ID_BROWSE_RENDER (1 << 10)
|
|
|
|
#define UI_ID_PREVIEWS (1 << 11)
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_ID_FULL (UI_ID_RENAME | UI_ID_BROWSE | UI_ID_ADD_NEW | UI_ID_OPEN | UI_ID_ALONE | UI_ID_DELETE | UI_ID_LOCAL)
|
2.5: ID datablock button back, previously known as std_libbuttons. The
way this worked in 2.4x wasn't really clean, with events going all over
the place and using dubious variables such as G.but->lockpoin or
G.sima->menunr. It works as follows now, for example:
xco= uiDefIDPoinButs(block, CTX_data_main(C), NULL, (ID**)&sima->image, ID_IM, &sima->pin, xco, yco,
sima_idpoin_handle, UI_ID_BROWSE|UI_ID_RENAME|UI_ID_ADD_NEW|UI_ID_OPEN|UI_ID_DELETE|UI_ID_ALONE|UI_ID_PIN);
The last two parameters are a callback function, and a list of events
or functionalities that are supported. The callback function will then
get the ID pointer + event to handle.
2009-02-06 16:40:14 +00:00
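The callback passed to uiDefIDPoinButs in the note above receives the ID pointer plus one of the UI_ID_* events to handle; its exact prototype is not shown here, so the sketch below only assumes a (bContext, ID, event) shape for illustration and is hypothetical throughout:

/* Hypothetical handler shape; the real prototype belongs to uiDefIDPoinButs
 * and may take different arguments. */
static void example_idpoin_handle(struct bContext *C, struct ID *id, int event)
{
	switch (event) {
		case UI_ID_BROWSE:
			/* user picked a datablock from the browse menu */
			break;
		case UI_ID_DELETE:
			/* unlink the current datablock */
			break;
		case UI_ID_ALONE:
			/* make a single-user copy */
			break;
		default:
			break;
	}
}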
|
|
|
|
2009-09-16 01:15:30 +00:00
|
|
|
int uiIconFromID(struct ID *id);
|
2014-02-09 12:00:03 +11:00
|
|
|
int uiIconFromReportType(int type);
|
2009-09-16 01:15:30 +00:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefPulldownBut(uiBlock *block, uiBlockCreateFunc func, void *arg, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefIconTextMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, int icon, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefIconMenuBut(uiBlock *block, uiMenuCreateFunc func, void *arg, int icon, int x, int y, short width, short height, const char *tip);
|
2008-11-11 18:31:32 +00:00
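uiDefMenuBut above pairs a header button with a uiMenuCreateFunc that fills in the menu contents when it opens. A brief sketch, assuming the uiLayout-based create-func signature and the uiItemO/uiItemS helpers from this header; the operator names are examples only:

#include "UI_interface.h"
#include "UI_resources.h"

/* Menu contents are built lazily when the button is pressed. */
static void example_view_menu(bContext *C, uiLayout *layout, void *arg_unused)
{
	uiItemO(layout, NULL, ICON_NONE, "SCREEN_OT_screen_full_area");
	uiItemS(layout); /* separator */
	uiItemO(layout, NULL, ICON_NONE, "SCREEN_OT_repeat_last");
}

/* In the header draw code, hooked up to a menu button: */
static void example_add_menu_button(uiBlock *block, int x, int y)
{
	uiDefMenuBut(block, example_view_menu, NULL, "View", x, y, 80, 20, "View menu");
}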
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefBlockBut(uiBlock *block, uiBlockCreateFunc func, void *func_arg1, const char *str, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefBlockButN(uiBlock *block, uiBlockCreateFunc func, void *argN, const char *str, int x, int y, short width, short height, const char *tip);
|
2009-06-10 11:43:21 +00:00
|
|
|
|
2012-08-18 18:11:51 +00:00
|
|
|
uiBut *uiDefIconBlockBut(uiBlock *block, uiBlockCreateFunc func, void *arg, int retval, int icon, int x, int y, short width, short height, const char *tip);
|
|
|
|
uiBut *uiDefIconTextBlockBut(uiBlock *block, uiBlockCreateFunc func, void *arg, int icon, const char *str, int x, int y, short width, short height, const char *tip);
|
uiBut *uiDefKeyevtButS(uiBlock *block, int retval, const char *str, int x, int y, short width, short height, short *spoin, const char *tip);
uiBut *uiDefHotKeyevtButS(uiBlock *block, int retval, const char *str, int x, int y, short width, short height, short *keypoin, short *modkeypoin, const char *tip);
uiBut *uiDefSearchBut(uiBlock *block, void *arg, int retval, int icon, int maxlen, int x, int y, short width, short height, float a1, float a2, const char *tip);
uiBut *uiDefSearchButO_ptr(uiBlock *block, struct wmOperatorType *ot, IDProperty *properties,
                           void *arg, int retval, int icon, int maxlen, int x, int y,
                           short width, short height, float a1, float a2, const char *tip);
uiBut *uiDefAutoButR(uiBlock *block, struct PointerRNA *ptr, struct PropertyRNA *prop, int index, const char *name, int icon, int x1, int y1, int x2, int y2);
int uiDefAutoButsRNA(uiLayout *layout, struct PointerRNA *ptr, bool (*check_prop)(struct PointerRNA *, struct PropertyRNA *), const char label_align);
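A hedged sketch of auto-generating RNA buttons with uiDefAutoButsRNA above. The check_prop signature comes from the declaration; RNA_property_identifier() is assumed from Blender's RNA API (not declared in this excerpt), and passing '\0' for label_align is assumed to mean default label alignment.

#include <string.h>

/* hypothetical property filter: skip the "rna_type" property, keep the rest */
static bool demo_check_prop(struct PointerRNA *ptr, struct PropertyRNA *prop)
{
    (void)ptr;
    return strcmp(RNA_property_identifier(prop), "rna_type") != 0;
}

/* in a draw function that already has a uiLayout and a PointerRNA: */
/*   uiDefAutoButsRNA(layout, &ptr, demo_check_prop, '\0'); */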
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
  specific to making popup menus. Will convert the other
  confirmation popup menu functions to use this too so we can
  remove some code.
* Extended the uiMenu functions, now there is also:
  BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and improve bounds computation for popup
  menus. It used to try to find out the size of the menu in advance,
  which is difficult with keymap strings in there; now
  uiPopupBoundsBlock can figure this out afterwards and ensure the
  popup stays within the window bounds. Will convert some other
  functions to use this too.
/* Links
 *
 * Game engine logic brick links. Non-functional currently in 2.5,
 * code to handle and draw these is disabled internally. */
void uiSetButLink(struct uiBut *but, void **poin, void ***ppoin, short *tot, int from, int to);
void uiComposeLinks(uiBlock *block);
uiBut *uiFindInlink(uiBlock *block, void *poin);
/* use inside searchfunc to add items */
bool uiSearchItemAdd(uiSearchItems *items, const char *name, void *poin, int iconid);
/* bfunc gets search item *poin as arg2, or if NULL the old string */
void uiButSetSearchFunc(uiBut *but, uiButSearchFunc sfunc, void *arg1, uiButHandleFunc bfunc, void *active);
/* height in pixels, it's using hardcoded values still */
int uiSearchBoxHeight(void);
int uiSearchBoxWidth(void);
/* check if a string is in an existing search box */
int uiSearchItemFindIndex(uiSearchItems *items, const char *name);
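A hedged sketch of wiring up a search button with the functions above. Only uiSearchItemAdd and uiButSetSearchFunc come from this excerpt; the uiButSearchFunc/uiButHandleFunc callback signatures are not declared here, so the ones used below are assumptions, and the name list is made up.

#include <stdio.h>
#include <string.h>

/* assumed callback signatures (not part of this excerpt):
 *   typedef void (*uiButSearchFunc)(const struct bContext *C, void *arg, const char *str, uiSearchItems *items);
 *   typedef void (*uiButHandleFunc)(struct bContext *C, void *arg1, void *arg2);
 */

static const char *demo_names[] = {"Cube", "Camera", "Lamp", NULL};

static void demo_search_cb(const struct bContext *C, void *arg, const char *str, uiSearchItems *items)
{
    (void)C; (void)arg;
    /* add every name containing the typed string; stop when the item list is full */
    for (int i = 0; demo_names[i]; i++) {
        if (strstr(demo_names[i], str) && !uiSearchItemAdd(items, demo_names[i], (void *)demo_names[i], 0)) {
            break;
        }
    }
}

static void demo_search_exec_cb(struct bContext *C, void *arg1, void *arg2)
{
    (void)C; (void)arg1;
    /* per the comment above, arg2 is the chosen item's *poin (or the old string if that was NULL) */
    printf("picked: %s\n", (const char *)arg2);
}

/* after creating the button with uiDefSearchBut(): */
/*   uiButSetSearchFunc(but, demo_search_cb, NULL, demo_search_exec_cb, NULL); */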
void uiBlockSetHandleFunc(uiBlock *block, uiBlockHandleFunc func, void *arg);
void uiBlockSetButmFunc(uiBlock *block, uiMenuHandleFunc func, void *arg);
void uiBlockSetFunc(uiBlock *block, uiButHandleFunc func, void *arg1, void *arg2);
void uiBlockSetNFunc(uiBlock *block, uiButHandleNFunc funcN, void *argN, void *arg2);
void uiButSetRenameFunc(uiBut *but, uiButHandleRenameFunc func, void *arg1);
void uiButSetFunc(uiBut *but, uiButHandleFunc func, void *arg1, void *arg2);
void uiButSetNFunc(uiBut *but, uiButHandleNFunc funcN, void *argN, void *arg2);
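For completeness, a small sketch of hooking a plain change callback to a single button via uiButSetFunc above. The uiButHandleFunc signature is again an assumption, as it is not declared in this excerpt, and the block-level comment is only a guess at how uiBlockSetFunc relates to it.

#include <stdio.h>

/* assumed: typedef void (*uiButHandleFunc)(struct bContext *C, void *arg1, void *arg2); */
static void demo_but_changed(struct bContext *C, void *arg1, void *arg2)
{
    (void)C; (void)arg2;
    printf("button changed: %s\n", (const char *)arg1);   /* arg1 is set at registration */
}

/* uiButSetFunc(but, demo_but_changed, (void *)"demo", NULL); */
/* uiBlockSetFunc(block, demo_but_changed, (void *)"block", NULL);  -- block-level variant,
 * assumed to act as the default callback for the block's buttons */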
void uiButSetCompleteFunc(uiBut *but, uiButCompleteFunc func, void *arg);
void uiBlockSetDrawExtraFunc(uiBlock *block,
                             void (*func)(const struct bContext *C, void *, void *, void *, struct rcti *rect),
                             void *arg1, void *arg2);
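A small sketch of a custom block draw callback matching the function pointer type above. The three void pointers are unnamed in the declaration, so the parameter names here are arbitrary; the body is left as a comment since no drawing API is declared in this excerpt, and my_arg1/my_arg2 are hypothetical.

static void demo_draw_extra(const struct bContext *C, void *idv, void *arg1, void *arg2, struct rcti *rect)
{
    /* rect is the block area reserved for custom drawing; real code would
     * issue its draw calls here */
    (void)C; (void)idv; (void)arg1; (void)arg2; (void)rect;
}

/* uiBlockSetDrawExtraFunc(block, demo_draw_extra, my_arg1, my_arg2); */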
bool UI_textbutton_activate_rna(const struct bContext *C, struct ARegion *ar,
                                const void *rna_poin_data, const char *rna_prop_id);
bool UI_textbutton_activate_but(const struct bContext *C, uiBut *but);
void uiButSetFocusOnEnter(struct wmWindow *win, uiBut *but);
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
conformation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was but this is difficult with keymap strings in
there, now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
2009-01-30 12:18:08 +00:00
|
|
|
/* Autocomplete
 *
 * Tab complete helper functions, for use in uiButCompleteFunc callbacks.
 * Call begin once, then multiple times do_name with all possibilities,
 * and finally end to finish and get the completed name. */

typedef struct AutoComplete AutoComplete;

#define AUTOCOMPLETE_NO_MATCH 0
#define AUTOCOMPLETE_FULL_MATCH 1
#define AUTOCOMPLETE_PARTIAL_MATCH 2

AutoComplete *autocomplete_begin(const char *startname, size_t maxlen);
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things a bit and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too, this is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc, that will then call uiMenu* functions.
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
conformation popup menu functions to use this too so we can
remove some code.
* Extended uiMenu functions, now there is is also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, simplifies
menu code further here. Did not remove the uiDefMenu functions as
they are used in sequencer/view3d in some places now (will fix).
* Also tried to simplify and fix bounds computation a bit better
for popup menus. It tried to find out in advance what the size
of the menu was but this is difficult with keymap strings in
there, now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
2009-01-30 12:18:08 +00:00
|
|
|
void autocomplete_do_name(AutoComplete *autocpl, const char *name);
int autocomplete_end(AutoComplete *autocpl, char *autoname);
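A sketch of the begin/do_name/end protocol described in the Autocomplete comment above. The candidate names are made up, and it is assumed (from "get the completed name") that autocomplete_end writes the result back into the buffer passed as autoname.

/* e.g. called from a uiButCompleteFunc with the button's current string */
static int demo_complete(char *str, size_t maxlen)
{
    static const char *candidates[] = {"Material", "Mesh", "MetaBall", NULL};
    AutoComplete *autocpl = autocomplete_begin(str, maxlen);
    int i;

    /* feed every possible name so the helper can track the common match */
    for (i = 0; candidates[i]; i++) {
        autocomplete_do_name(autocpl, candidates[i]);
    }

    /* writes the completed name back into str and reports the match type:
     * AUTOCOMPLETE_NO_MATCH, AUTOCOMPLETE_FULL_MATCH or AUTOCOMPLETE_PARTIAL_MATCH */
    return autocomplete_end(autocpl, str);
}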
/* Panels
 *
 * Functions for creating, freeing and drawing panels. The API here
 * could use a good cleanup, though how they will function in 2.5 is
 * not clear yet so we postpone that. */
void uiBeginPanels(const struct bContext *C, struct ARegion *ar);
void uiEndPanels(const struct bContext *C, struct ARegion *ar, int *x, int *y);
void uiDrawPanels(const struct bContext *C, struct ARegion *ar);

struct Panel *uiPanelFindByType(struct ARegion *ar, struct PanelType *pt);
struct Panel *uiBeginPanel(struct ScrArea *sa, struct ARegion *ar, uiBlock *block,
                           struct PanelType *pt, struct Panel *pa, bool *r_open);
void uiEndPanel(uiBlock *block, int width, int height);
void uiScalePanels(struct ARegion *ar, float new_width);
2013-12-17 03:21:55 +11:00
|
|
|
bool UI_panel_category_is_visible(struct ARegion *ar);
|
|
|
|
void UI_panel_category_add(struct ARegion *ar, const char *name);
|
|
|
|
struct PanelCategoryDyn *UI_panel_category_find(struct ARegion *ar, const char *idname);
|
|
|
|
struct PanelCategoryStack *UI_panel_category_active_find(struct ARegion *ar, const char *idname);
|
|
|
|
const char *UI_panel_category_active_get(struct ARegion *ar, bool set_fallback);
|
|
|
|
void UI_panel_category_active_set(struct ARegion *ar, const char *idname);
|
|
|
|
struct PanelCategoryDyn *UI_panel_category_find_mouse_over_ex(struct ARegion *ar, const int x, const int y);
|
|
|
|
struct PanelCategoryDyn *UI_panel_category_find_mouse_over(struct ARegion *ar, const struct wmEvent *event);
|
|
|
|
void UI_panel_category_clear_all(struct ARegion *ar);
|
|
|
|
void UI_panel_category_draw_all(struct ARegion *ar, const char *category_id_active);
|
|
|
|
|
2.5: UI & Menus
* Cleaned up UI_interface.h a bit, and added some comments to
organize things and indicate what should be used when.
* uiMenu* functions can now be used to create menus for headers
too. This is done with a uiDefMenuBut, which takes a pointer
to a uiMenuCreateFunc that will then call uiMenu* functions
(see the sketch after this log entry).
* Renamed uiMenuBegin/End to uiPupMenuBegin/End, as these are
specific to making popup menus. Will convert the other
confirmation popup menu functions to use this too so we can
remove some code.
* Extended the uiMenu functions, there are now also:
BooleanO, FloatO, BooleanR, EnumR, LevelEnumR, Separator.
* Converted image window headers to use uiMenu functions, which
simplifies the menu code further here. Did not remove the uiDefMenu
functions as they are still used in sequencer/view3d in some places (will fix).
* Also tried to simplify and improve bounds computation for popup
menus. Previously the code tried to find out in advance what the size
of the menu was, but this is difficult with keymap strings in
there; now uiPopupBoundsBlock can figure this out afterwards and
ensure the popup is within the window bounds. Will convert some
other functions to use this too.
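A minimal sketch of a header menu built through a uiMenuCreateFunc callback.
The callback signature, ICON_NONE and the operator name are assumptions for
illustration; only uiItemMenuF, uiItemO and uiItemS are taken from the
declarations in this header:

/* Sketch only: the menu system calls this back to fill the layout. */
static void view_menu_draw_sketch(struct bContext *C, uiLayout *layout, void *arg)
{
	uiItemO(layout, "Open...", ICON_NONE, "WM_OT_example_open"); /* placeholder op */
	uiItemS(layout);                                             /* separator      */
	(void)C; (void)arg;
}

static void header_menus_sketch(uiLayout *layout)
{
	/* adds a "View" pulldown that calls the function above when opened */
	uiItemMenuF(layout, "View", ICON_NONE, view_menu_draw_sketch, NULL);
}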
2009-01-30 12:18:08 +00:00
|
|
|
/* Handlers
|
|
|
|
*
|
|
|
|
* Handlers that can be registered in regions, areas and windows for
|
|
|
|
* handling WM events. Mostly this is done automatically by modules such
|
|
|
|
* as screen/ if ED_KEYMAP_UI is set, or internally in popup functions. */
|
UI: don't use operators anymore for handling user interface events, but rather
a special UI handler which makes the code clearer. This UI handler is attached
to the region along with other handlers, and also gets a callback when all
handlers for the region are removed, to ensure things are properly cleaned up.
This should fix XXX's in the UI code related to events and context switching.
Most of the changes are in interface_handlers.c, which was renamed from
interface_ops.c, to convert operators to the UI handler. UI code notes:
* uiBeginBlock/uiEndBlock/uiFreeBlocks now take a context argument, this is
required to properly cancel things like timers or tooltips when the region
gets removed.
* UI_add_region_handlers will add the region level UI handlers, to be used
when adding keymap handlers etc. This replaces the UI keymap (a registration
sketch follows this log entry).
* When the UI code starts a modal interaction (number sliding, text editing,
opening a menu, ...), it will add a UI handler at the window level which
will block events.
Windowmanager changes:
* Added a UI handler next to the existing keymap and operator modal handlers.
It has an event handling and a remove callback, and like operator modal handlers
it will remember the area and region if it is registered at the window level.
* Removed the MESSAGE event.
* Operator cancel and UI handler remove callbacks now get the
window/area/region restored in the context, like the operator modal and UI
handler event callbacks.
* Regions now receive MOUSEMOVE events for the mouse going outside of the
region. This was already happening for areas, but UI buttons are at the region
level so we need it there.
Issues:
* Tooltips and menus stay open when switching to another window, and button
highlight doesn't work without moving the mouse first when Blender starts up.
I tried using some events like Q_FIRSTTIME and WINTHAW, but those don't seem to
arrive.
* Timeline header buttons seem to be moving one pixel or so sometimes when
interacting with them.
* Seems not due to this commit, but UI and keymap handlers are leaking. It
seems that handlers are being added to regions in all screens, also in regions
of areas that are not visible, but these handlers are not removed. Probably
there should only be handlers in visible regions?
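A minimal sketch of the region level registration mentioned above; the region
init function is a placeholder, and ARegion's 'handlers' list is an assumption
about the struct, not part of this header:

static void region_init_sketch(struct ARegion *ar)
{
	/* attach the region level UI handlers, replacing the old UI keymap */
	UI_add_region_handlers(&ar->handlers);

	/* modal interactions (text editing, menus, ...) later add a blocking
	 * window level UI handler internally; callers do not do this themselves */
}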
2008-12-10 04:36:33 +00:00
|
|
|
|
|
|
|
void UI_add_region_handlers(struct ListBase *handlers);
|
2009-11-23 13:58:55 +00:00
|
|
|
void UI_add_popup_handlers(struct bContext *C, struct ListBase *handlers, uiPopupBlockHandle *popup);
|
|
|
|
void UI_remove_popup_handlers(struct ListBase *handlers, uiPopupBlockHandle *popup);
|
2013-06-02 20:59:00 +00:00
|
|
|
void UI_remove_popup_handlers_all(struct bContext *C, struct ListBase *handlers);
|
2008-12-10 04:36:33 +00:00
|
|
|
|
2009-01-30 12:18:08 +00:00
|
|
|
/* Module
|
|
|
|
*
|
|
|
|
* init and exit should be called before using this module. init_userdef must
|
|
|
|
* be used to reinitialize some internal state if user preferences change. */
|
2008-11-11 18:31:32 +00:00
|
|
|
|
|
|
|
void UI_init(void);
|
|
|
|
void UI_init_userdef(void);
|
2013-11-25 11:45:12 +11:00
|
|
|
void UI_init_userdef_factory(void);
|
2011-09-21 15:15:30 +00:00
|
|
|
void UI_reinit_font(void);
|
2008-11-11 18:31:32 +00:00
|
|
|
void UI_exit(void);
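A minimal sketch of the intended call order for the module functions above; the
wm_*_sketch wrappers are placeholders, not real windowmanager functions:

static void wm_startup_sketch(void)
{
	UI_init();          /* once at startup, before any UI code runs      */
	UI_init_userdef();  /* after user preferences have been loaded       */
}

static void wm_prefs_changed_sketch(void)
{
	UI_init_userdef();  /* reinitialize internal state when prefs change */
}

static void wm_exit_sketch(void)
{
	UI_exit();          /* free module level resources on shutdown       */
}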
|
|
|
|
|
2009-03-13 13:38:41 +00:00
|
|
|
/* Layout
|
|
|
|
*
|
|
|
|
* More automated layout of buttons. Has three levels:
|
|
|
|
* - Layout: contains a number of templates, within a bounded width or height.
|
|
|
|
* - Template: predefined layouts for buttons with a number of slots, each
|
|
|
|
* slot can contain multiple items.
|
|
|
|
* - Item: item to put in a template slot, being either an RNA property,
|
UI: Layout Engine
* Buttons are now created first, and after that the layout is computed.
This means the layout engine now works at button level, and makes it
easier to write templates. Otherwise you had to store all info and
create the buttons later.
* Added interface_templates.c as a separate file to put templates in.
These can contain regular buttons, and can be put in a Free layout,
which means you can specify manual coordinates but still get nested
correctly inside other layouts.
* The API was changed to allow better nesting. Previously items were added
in the last added layout specifier, i.e. one level up in the layout
hierarchy. This doesn't always work well, so now creating things
like rows or columns always returns a layout which you then add
the items to. All py scripts were updated to follow this
(a layout sketch follows this log entry).
* Computing the layout now goes in two passes, first estimating the
required width/height of all nested layouts, and then in the second
pass using the results of that to decide on the actual locations.
* Enum and array buttons now follow the direction of the layout, i.e.
they are vertical or horizontal depending on whether they are in a column or row.
* Color properties now get a color picker, and only get the additional
RGB sliders with Expand=True.
* File/directory string properties now get a button next to them for
opening the file browser, though this is not implemented yet.
* Layout items can now be aligned, set align=True when creating a column,
row, etc.
* Buttons now get a minimum width of one icon (avoids squashing icon
buttons).
* Moved some more space variables into Style.
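A minimal sketch of the buttons-first, two-pass flow using only declarations
from this header; the block, the RNA pointer, the property name, ICON_NONE and
the operator name are placeholders for illustration:

static void block_layout_sketch(uiBlock *block, struct PointerRNA *ptr)
{
	uiStyle *style = UI_GetStyle();
	uiLayout *layout, *col, *row;
	int x = 0, y = 0;

	/* pass 1: buttons are created inside nested layouts */
	layout = uiBlockLayout(block, UI_LAYOUT_VERTICAL, UI_LAYOUT_PANEL,
	                       0, 0, 200, 0, 0, style);

	col = uiLayoutColumn(layout, 1);                /* align=1: items touch */
	uiItemR(col, ptr, "name", 0, NULL, ICON_NONE);  /* RNA property item    */

	row = uiLayoutRow(col, 0);
	uiItemO(row, NULL, ICON_NONE, "WM_OT_example"); /* placeholder operator */

	/* pass 2: estimate sizes, then compute the actual button positions */
	uiBlockLayoutResolve(block, &x, &y);
}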
2009-05-15 11:19:59 +00:00
|
|
|
* operator, label or menu. Also regular buttons can be used when setting
|
|
|
|
* uiBlockCurLayout. */
|
2009-03-13 13:38:41 +00:00
|
|
|
|
|
|
|
/* layout */
|
2012-05-12 20:39:39 +00:00
|
|
|
#define UI_LAYOUT_HORIZONTAL 0
|
|
|
|
#define UI_LAYOUT_VERTICAL 1
|
|
|
|
|
|
|
|
#define UI_LAYOUT_PANEL 0
|
|
|
|
#define UI_LAYOUT_HEADER 1
|
|
|
|
#define UI_LAYOUT_MENU 2
|
|
|
|
#define UI_LAYOUT_TOOLBAR 3
|
|
|
|
|
2013-12-24 17:20:37 +11:00
|
|
|
#define UI_UNIT_X ((void)0, U.widget_unit)
|
|
|
|
#define UI_UNIT_Y ((void)0, U.widget_unit)
|
2012-05-12 20:39:39 +00:00
|
|
|
|
|
|
|
#define UI_LAYOUT_ALIGN_EXPAND 0
|
|
|
|
#define UI_LAYOUT_ALIGN_LEFT 1
|
|
|
|
#define UI_LAYOUT_ALIGN_CENTER 2
|
|
|
|
#define UI_LAYOUT_ALIGN_RIGHT 3
|
|
|
|
|
2013-06-25 10:30:07 +00:00
|
|
|
#define UI_ITEM_O_RETURN_PROPS (1 << 0)
|
|
|
|
#define UI_ITEM_R_EXPAND (1 << 1)
|
|
|
|
#define UI_ITEM_R_SLIDER (1 << 2)
|
|
|
|
#define UI_ITEM_R_TOGGLE (1 << 3)
|
|
|
|
#define UI_ITEM_R_ICON_ONLY (1 << 4)
|
|
|
|
#define UI_ITEM_R_EVENT (1 << 5)
|
|
|
|
#define UI_ITEM_R_FULL_EVENT (1 << 6)
|
|
|
|
#define UI_ITEM_R_NO_BG (1 << 7)
|
|
|
|
#define UI_ITEM_R_IMMEDIATE (1 << 8)
|
2009-08-21 12:57:47 +00:00
|
|
|
|
2010-12-15 05:42:23 +00:00
|
|
|
/* uiLayoutOperatorButs flags */
|
|
|
|
#define UI_LAYOUT_OP_SHOW_TITLE 1
|
|
|
|
#define UI_LAYOUT_OP_SHOW_EMPTY 2
|
|
|
|
|
2013-12-24 17:20:37 +11:00
|
|
|
/* used for transp checkers */
|
|
|
|
#define UI_ALPHA_CHECKER_DARK 100
|
|
|
|
#define UI_ALPHA_CHECKER_LIGHT 160
|
|
|
|
|
2011-09-11 06:41:09 +00:00
|
|
|
/* flags to set which corners will become rounded:
|
|
|
|
*
|
|
|
|
* 1------2
|
|
|
|
* | |
|
|
|
|
* 8------4 */
|
|
|
|
|
|
|
|
enum {
|
2013-06-25 10:30:07 +00:00
|
|
|
UI_CNR_TOP_LEFT = (1 << 0),
|
|
|
|
UI_CNR_TOP_RIGHT = (1 << 1),
|
|
|
|
UI_CNR_BOTTOM_RIGHT = (1 << 2),
|
|
|
|
UI_CNR_BOTTOM_LEFT = (1 << 3),
|
2011-09-11 06:41:09 +00:00
|
|
|
/* just for convenience */
|
2013-06-25 10:30:07 +00:00
|
|
|
UI_CNR_NONE = 0,
|
|
|
|
UI_CNR_ALL = (UI_CNR_TOP_LEFT | UI_CNR_TOP_RIGHT | UI_CNR_BOTTOM_RIGHT | UI_CNR_BOTTOM_LEFT)
|
2011-09-11 06:41:09 +00:00
|
|
|
};
|
|
|
|
|
|
|
|
/* not a part of the corner flags but mixed in by some functions */
|
|
|
|
#define UI_RB_ALPHA (UI_CNR_ALL + 1)
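For example, following the corner numbering in the diagram above, a widget
with only its top corners rounded would combine the flags like this
(hypothetical local variable, shown purely to illustrate the bitmask):

	int roundbox = UI_CNR_TOP_LEFT | UI_CNR_TOP_RIGHT;  /* corners 1 and 2 */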
|
|
|
|
|
2014-01-17 00:23:00 +01:00
|
|
|
uiLayout *uiBlockLayout(uiBlock *block, int dir, int type, int x, int y, int size, int em, int padding, struct uiStyle *style);
|
2009-05-15 11:19:59 +00:00
|
|
|
void uiBlockSetCurLayout(uiBlock *block, uiLayout *layout);
|
2009-09-16 18:47:42 +00:00
|
|
|
void uiBlockLayoutResolve(uiBlock *block, int *x, int *y);
|
2009-03-13 13:38:41 +00:00
|
|
|
|
2009-05-28 23:37:55 +00:00
|
|
|
uiBlock *uiLayoutGetBlock(uiLayout *layout);
|
2009-04-22 18:39:44 +00:00
|
|
|
|
2009-05-28 23:37:55 +00:00
|
|
|
void uiLayoutSetFunc(uiLayout *layout, uiMenuHandleFunc handlefunc, void *argv);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiLayoutSetContextPointer(uiLayout *layout, const char *name, struct PointerRNA *ptr);
|
2012-04-03 15:18:59 +00:00
|
|
|
void uiLayoutContextCopy(uiLayout *layout, struct bContextStore *context);
|
2010-11-17 09:45:45 +00:00
|
|
|
const char *uiLayoutIntrospect(uiLayout *layout); // XXX - testing
|
2012-09-04 18:47:08 +00:00
|
|
|
void uiLayoutOperatorButs(const struct bContext *C, struct uiLayout *layout, struct wmOperator *op,
|
2013-03-19 23:17:44 +00:00
|
|
|
bool (*check_prop)(struct PointerRNA *, struct PropertyRNA *),
|
2012-09-04 18:47:08 +00:00
|
|
|
const char label_align, const short flag);
|
2011-11-14 14:42:47 +00:00
|
|
|
struct MenuType *uiButGetMenuType(uiBut *but);
|
2009-05-28 23:13:42 +00:00
|
|
|
|
2009-08-21 12:57:47 +00:00
|
|
|
void uiLayoutSetOperatorContext(uiLayout *layout, int opcontext);
|
2013-04-04 02:05:11 +00:00
|
|
|
void uiLayoutSetActive(uiLayout *layout, bool active);
|
|
|
|
void uiLayoutSetEnabled(uiLayout *layout, bool enabled);
|
|
|
|
void uiLayoutSetRedAlert(uiLayout *layout, bool redalert);
|
|
|
|
void uiLayoutSetAlignment(uiLayout *layout, char alignment);
|
|
|
|
void uiLayoutSetKeepAspect(uiLayout *layout, bool keepaspect);
|
2009-06-03 00:04:48 +00:00
|
|
|
void uiLayoutSetScaleX(uiLayout *layout, float scale);
|
|
|
|
void uiLayoutSetScaleY(uiLayout *layout, float scale);
|
2009-05-28 23:37:55 +00:00
|
|
|
|
2009-06-09 10:30:11 +00:00
|
|
|
int uiLayoutGetOperatorContext(uiLayout *layout);
|
2013-04-04 02:05:11 +00:00
|
|
|
bool uiLayoutGetActive(uiLayout *layout);
|
|
|
|
bool uiLayoutGetEnabled(uiLayout *layout);
|
|
|
|
bool uiLayoutGetRedAlert(uiLayout *layout);
|
2009-05-28 23:37:55 +00:00
|
|
|
int uiLayoutGetAlignment(uiLayout *layout);
|
2013-04-04 02:05:11 +00:00
|
|
|
bool uiLayoutGetKeepAspect(uiLayout *layout);
|
2009-08-16 13:01:40 +00:00
|
|
|
int uiLayoutGetWidth(uiLayout *layout);
|
2009-06-03 00:04:48 +00:00
|
|
|
float uiLayoutGetScaleX(uiLayout *layout);
|
|
|
|
float uiLayoutGetScaleY(uiLayout *layout);
|
2009-05-28 23:37:55 +00:00
|
|
|
|
2009-04-16 12:17:58 +00:00
|
|
|
/* layout specifiers */
|
2009-05-15 11:19:59 +00:00
|
|
|
uiLayout *uiLayoutRow(uiLayout *layout, int align);
|
|
|
|
uiLayout *uiLayoutColumn(uiLayout *layout, int align);
|
|
|
|
uiLayout *uiLayoutColumnFlow(uiLayout *layout, int number, int align);
|
2009-04-16 12:17:58 +00:00
|
|
|
uiLayout *uiLayoutBox(uiLayout *layout);
|
2012-12-28 10:32:49 +00:00
|
|
|
uiLayout *uiLayoutListBox(uiLayout *layout, struct uiList *ui_list, struct PointerRNA *ptr, struct PropertyRNA *prop,
|
2012-05-12 20:39:39 +00:00
|
|
|
struct PointerRNA *actptr, struct PropertyRNA *actprop);
|
2009-10-21 20:58:10 +00:00
|
|
|
uiLayout *uiLayoutAbsolute(uiLayout *layout, int align);
|
2009-12-10 14:47:07 +00:00
|
|
|
uiLayout *uiLayoutSplit(uiLayout *layout, float percentage, int align);
|
2009-10-21 20:58:10 +00:00
|
|
|
uiLayout *uiLayoutOverlap(uiLayout *layout);
|
2009-03-13 13:38:41 +00:00
|
|
|
|
2009-10-09 10:45:11 +00:00
|
|
|
uiBlock *uiLayoutAbsoluteBlock(uiLayout *layout);
|
2009-05-19 17:13:33 +00:00
|
|
|
|
2009-04-16 12:17:58 +00:00
|
|
|
/* templates */
|
2014-01-27 18:38:53 +11:00
|
|
|
void uiTemplateHeader(uiLayout *layout, struct bContext *C);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateID(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname,
|
2012-05-12 20:39:39 +00:00
|
|
|
const char *newop, const char *openop, const char *unlinkop);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateIDBrowse(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname,
|
2011-11-11 13:09:14 +00:00
|
|
|
const char *newop, const char *openop, const char *unlinkop);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateIDPreview(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname,
|
2012-05-12 20:39:39 +00:00
|
|
|
const char *newop, const char *openop, const char *unlinkop, int rows, int cols);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateAnyID(uiLayout *layout, struct PointerRNA *ptr, const char *propname,
|
2012-05-12 20:39:39 +00:00
|
|
|
const char *proptypename, const char *text);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplatePathBuilder(uiLayout *layout, struct PointerRNA *ptr, const char *propname,
|
2012-05-12 20:39:39 +00:00
|
|
|
struct PointerRNA *root_ptr, const char *text);
|
2010-08-16 23:28:20 +00:00
|
|
|
uiLayout *uiTemplateModifier(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr);
|
|
|
|
uiLayout *uiTemplateConstraint(uiLayout *layout, struct PointerRNA *ptr);
|
2010-11-19 07:46:23 +00:00
|
|
|
void uiTemplatePreview(uiLayout *layout, struct ID *id, int show_buttons, struct ID *parent, struct MTex *slot);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateColorRamp(uiLayout *layout, struct PointerRNA *ptr, const char *propname, int expand);
|
2013-01-22 11:18:41 +00:00
|
|
|
void uiTemplateIconView(uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateHistogram(uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiTemplateWaveform(uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiTemplateVectorscope(uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiTemplateCurveMapping(uiLayout *layout, struct PointerRNA *ptr, const char *propname, int type, int levels, int brush);
|
2012-11-09 11:03:53 +00:00
|
|
|
void uiTemplateColorPicker(uiLayout *layout, struct PointerRNA *ptr, const char *propname, int value_slider, int lock, int lock_luminosity, int cubic);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateLayers(uiLayout *layout, struct PointerRNA *ptr, const char *propname,
|
2011-11-11 13:09:14 +00:00
|
|
|
PointerRNA *used_ptr, const char *used_propname, int active_layer);
|
2012-09-21 04:09:09 +00:00
|
|
|
void uiTemplateGameStates(uiLayout *layout, struct PointerRNA *ptr, const char *propname,
|
|
|
|
PointerRNA *used_ptr, const char *used_propname, int active_state);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiTemplateImage(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname, struct PointerRNA *userptr, int compact);
|
Color Management, Stage 2: Switch color pipeline to use OpenColorIO
Replace the old color pipeline, which supported linear/sRGB color spaces
only, with an OpenColorIO-based pipeline.
This introduces two configurable color spaces:
- Input color space for images and movie clips. This space is used to convert
images/movies from the color space in which the file is saved to Blender's linear
space (for float images; byte images are not internally converted, only the input
space is stored for such images and used later).
This setting can be found in the image/clip data block settings.
- Display color space, which defines the space in which a particular display works.
This setting can be found in the scene's Color Management panel.
When the render result is displayed on the screen, apart from converting the image
to display space, some additional conversions can happen.
These conversions are:
- View, which defines a tone curve applied before the display transformation.
These are different ways to view the image on the same display device.
For example it could be used to emulate a film view on an sRGB display.
- Exposure, which affects the image exposure before the tone map is applied.
- Gamma, a post-display gamma correction that can be used to match a particular
display gamma.
- RGB curves, user-defined curves which are applied before the display
transformation and can be used for different purposes.
By default all these settings apply only to the render result and do not
affect other images. If a particular image needs to be affected by these
transformations, the "View as Render" setting of the image data block should be set
to true. Movie clips are always affected by all display transformations.
This commit also introduces a configurable color space in which the sequencer
works. This setting can be found in the scene's Color Management panel and
should be used if things such as grading need to be done in a color space
different from sRGB (i.e. when a Film view on an sRGB display is used, using the VD16
space as the sequencer's internal space would make grading work in a space
which is close to the space used for display).
Some technical notes:
- The image buffer's float buffer is now always in linear space, even if it was
created from 16bit byte images.
- The space of the byte buffer is stored in the image buffer's rect_colorspace property.
- The profile of the image buffer was removed since it is no longer meaningful.
- OpenGL and GLSL are supposed to always work in sRGB space. It is possible
to support other spaces, but that is quite a large project and not so
important right now.
- The legacy "Color Management disabled" option is emulated by using the None display.
This could have some regressions, but there's no clear way to avoid them.
- If OpenColorIO is disabled at build time, Blender should behave in the same way
as the previous release with color management enabled.
More details can be found at this page (more details will be added soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.64/Color_Management
--
Thanks to Xavier Thomas, Lukas Toene for initial work on OpenColorIO
integration and to Brecht van Lommel for some further development and code/
usecase review!
2012-09-15 10:05:07 +00:00
|
|
|
void uiTemplateImageSettings(uiLayout *layout, struct PointerRNA *imfptr, int color_management);
|
2009-06-23 00:45:41 +00:00
|
|
|
void uiTemplateImageLayers(uiLayout *layout, struct bContext *C, struct Image *ima, struct ImageUser *iuser);
|
2009-06-30 19:20:45 +00:00
|
|
|
void uiTemplateRunningJobs(uiLayout *layout, struct bContext *C);
|
2012-12-04 18:22:41 +00:00
|
|
|
void uiOperatorSearch_But(uiBut *but);
|
2009-06-30 19:20:45 +00:00
|
|
|
void uiTemplateOperatorSearch(uiLayout *layout);
|
2009-07-11 13:32:20 +00:00
|
|
|
void uiTemplateHeader3D(uiLayout *layout, struct bContext *C);
|
2011-05-02 11:34:57 +00:00
|
|
|
void uiTemplateEditModeSelection(uiLayout *layout, struct bContext *C);
|
2010-06-03 07:27:55 +00:00
|
|
|
void uiTemplateReportsBanner(uiLayout *layout, struct bContext *C);
|
2011-10-04 13:24:48 +00:00
|
|
|
void uiTemplateKeymapItemProperties(uiLayout *layout, struct PointerRNA *ptr);
|
2013-03-18 16:34:57 +00:00
|
|
|
void uiTemplateComponentMenu(uiLayout *layout, struct PointerRNA *ptr, const char *propname, const char *name);
|
|
|
|
void uiTemplateNodeSocket(uiLayout *layout, struct bContext *C, float *color);
|
2009-03-13 13:38:41 +00:00
|
|
|
|
2013-02-18 14:03:26 +00:00
|
|
|
/* Default UIList class name, keep in sync with its declaration in bl_ui/__init__.py */
|
|
|
|
#define UI_UL_DEFAULT_CLASS_NAME "UI_UL_list"
|
This commit frees list UI items from their dependency on Panel, and hence from all the limitations this implied (mostly, the "only one list per panel" one).
It introduces a new (py-extendable and registrable) RNA type, UIList (roughly similar to the Panel one), which currently contains only the "standard" list's scroll position and size (but may be expanded to include e.g. some filtering data, etc.). This now makes lists completely independent from Panels!
This UIList has a draw_item callback which allows customizing item drawing from Python, which all addons can now use. Incidentally, this also greatly simplifies the C code of this widget, as we do not code any "special case" here anymore!
To make all this work, other changes were also necessary:
* Now all buttons (uiBut struct) have a 'custom_data' void pointer, currently used to store the uiList struct associated with a given uiLayoutListBox.
* DynamicPaintSurface now exposes a new bool, use_color_preview (readonly), saying whether that surface has some 3D view preview data or not.
* The UILayout class now has four new (static) functions, to get the actual icon of any RNA object (important e.g. with materials or textures), and to get an enum item's UI name, description and icon.
* UILayout's label() func now takes an optional 'icon_value' integer parameter, which if not zero will override the 'icon' one (mandatory to use "custom" icons as generated for material/texture/... previews).
Note: not sure whether we should add that one to all of UILayout's prop funcs?
Note: will update addons using template list asap.
2012-12-28 09:20:16 +00:00
|
|
|
void uiTemplateList(uiLayout *layout, struct bContext *C, const char *listtype_name, const char *list_id,
|
|
|
|
struct PointerRNA *dataptr, const char *propname, struct PointerRNA *active_dataptr,
|
2013-08-27 15:27:41 +00:00
|
|
|
const char *active_propname, int rows, int maxrows, int layout_type, int columns);
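A minimal sketch of calling uiTemplateList from C with the signature declared
above; the RNA pointer and the "material_slots"/"active_material_index"
property names are illustrative assumptions:

static void material_slot_list_sketch(uiLayout *layout, struct bContext *C,
                                      struct PointerRNA *ob_ptr)
{
	/* default UIList class, empty list_id, the collection property and the
	 * active index property, then sizing hints */
	uiTemplateList(layout, C, UI_UL_DEFAULT_CLASS_NAME, "",
	               ob_ptr, "material_slots",
	               ob_ptr, "active_material_index",
	               5 /* rows */, 0 /* maxrows */, 0 /* layout_type */, 0 /* columns */);
}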
|
2011-11-07 22:28:49 +00:00
|
|
|
void uiTemplateNodeLink(uiLayout *layout, struct bNodeTree *ntree, struct bNode *node, struct bNodeSocket *input);
|
|
|
|
void uiTemplateNodeView(uiLayout *layout, struct bContext *C, struct bNodeTree *ntree, struct bNode *node, struct bNodeSocket *input);
|
2011-11-08 13:07:16 +00:00
|
|
|
void uiTemplateTextureUser(uiLayout *layout, struct bContext *C);
|
|
|
|
void uiTemplateTextureShow(uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, struct PropertyRNA *prop);
|
2009-07-21 01:26:17 +00:00
|
|
|
|
2011-11-07 12:55:18 +00:00
|
|
|
void uiTemplateMovieClip(struct uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname, int compact);
|
|
|
|
void uiTemplateTrack(struct uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiTemplateMarker(struct uiLayout *layout, struct PointerRNA *ptr, const char *propname, PointerRNA *userptr, PointerRNA *trackptr, int compact);
|
2013-04-23 05:29:06 +00:00
|
|
|
void uiTemplateMovieclipInformation(struct uiLayout *layout, struct PointerRNA *ptr, const char *propname, struct PointerRNA *userptr);
|
2011-11-07 12:55:18 +00:00
|
|
|
|
2012-09-15 10:05:07 +00:00
|
|
|
void uiTemplateColorspaceSettings(struct uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiTemplateColormanagedViewSettings(struct uiLayout *layout, struct bContext *C, struct PointerRNA *ptr, const char *propname);
|
|
|
|
|
2009-03-13 13:38:41 +00:00
|
|
|
/* items */
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemO(uiLayout *layout, const char *name, int icon, const char *opname);
|
2012-01-22 03:30:07 +00:00
|
|
|
void uiItemEnumO_ptr(uiLayout *layout, struct wmOperatorType *ot, const char *name, int icon, const char *propname, int value);
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemEnumO(uiLayout *layout, const char *opname, const char *name, int icon, const char *propname, int value);
|
2011-03-24 12:36:12 +00:00
|
|
|
void uiItemEnumO_value(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, int value);
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemEnumO_string(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, const char *value);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemsEnumO(uiLayout *layout, const char *opname, const char *propname);
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemBooleanO(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, int value);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemIntO(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, int value);
|
|
|
|
void uiItemFloatO(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, float value);
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemStringO(uiLayout *layout, const char *name, int icon, const char *opname, const char *propname, const char *value);
|
2012-01-22 03:30:07 +00:00
|
|
|
|
|
|
|
PointerRNA uiItemFullO_ptr(uiLayout *layout, struct wmOperatorType *ot, const char *name, int icon, IDProperty *properties, int context, int flag);
|
2010-11-17 09:45:45 +00:00
|
|
|
PointerRNA uiItemFullO(uiLayout *layout, const char *idname, const char *name, int icon, struct IDProperty *properties, int context, int flag);
|
|
|
|
|
|
|
|
void uiItemR(uiLayout *layout, struct PointerRNA *ptr, const char *propname, int flag, const char *name, int icon);
|
|
|
|
void uiItemFullR(uiLayout *layout, struct PointerRNA *ptr, struct PropertyRNA *prop, int index, int value, int flag, const char *name, int icon);
|
|
|
|
void uiItemEnumR(uiLayout *layout, const char *name, int icon, struct PointerRNA *ptr, const char *propname, int value);
|
|
|
|
void uiItemEnumR_string(uiLayout *layout, struct PointerRNA *ptr, const char *propname, const char *value, const char *name, int icon);
|
|
|
|
void uiItemsEnumR(uiLayout *layout, struct PointerRNA *ptr, const char *propname);
|
|
|
|
void uiItemPointerR(uiLayout *layout, struct PointerRNA *ptr, const char *propname, struct PointerRNA *searchptr, const char *searchpropname, const char *name, int icon);
|
|
|
|
void uiItemsFullEnumO(uiLayout *layout, const char *opname, const char *propname, struct IDProperty *properties, int context, int flag);
|
|
|
|
|
|
|
|
void uiItemL(uiLayout *layout, const char *name, int icon); /* label */
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemLDrag(uiLayout *layout, struct PointerRNA *ptr, const char *name, int icon); /* label icon for dragging */
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemM(uiLayout *layout, struct bContext *C, const char *menuname, const char *name, int icon); /* menu */
|
2010-12-03 17:05:21 +00:00
|
|
|
void uiItemV(uiLayout *layout, const char *name, int icon, int argval); /* value */
|
2009-04-22 18:39:44 +00:00
|
|
|
void uiItemS(uiLayout *layout); /* separator */
|
|
|
|
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemMenuF(uiLayout *layout, const char *name, int icon, uiMenuCreateFunc func, void *arg);
|
2013-05-10 23:41:41 +00:00
|
|
|
void uiItemMenuEnumO(uiLayout *layout, struct bContext *C, const char *opname, const char *propname, const char *name, int icon);
|
2010-11-17 09:45:45 +00:00
|
|
|
void uiItemMenuEnumR(uiLayout *layout, struct PointerRNA *ptr, const char *propname, const char *name, int icon);
|
2009-03-13 13:38:41 +00:00
|
|
|
|
2009-12-17 10:47:55 +00:00
|
|
|
/* UI Operators */
|
2009-12-17 17:15:38 +00:00
|
|
|
void UI_buttons_operatortypes(void);
|
2009-12-17 10:47:55 +00:00
|
|
|
|
2009-07-28 18:51:06 +00:00
|
|
|
/* Helpers for Operators */
|
2011-10-23 04:13:56 +00:00
|
|
|
uiBut *uiContextActiveButton(const struct bContext *C);
|
2010-09-25 14:32:26 +00:00
|
|
|
void uiContextActiveProperty(const struct bContext *C, struct PointerRNA *ptr, struct PropertyRNA **prop, int *index);
|
2011-09-28 18:45:17 +00:00
|
|
|
void uiContextActivePropertyHandle(struct bContext *C);
|
2011-11-10 03:44:50 +00:00
|
|
|
struct wmOperator *uiContextActiveOperator(const struct bContext *C);
|
2010-09-25 14:32:26 +00:00
|
|
|
void uiContextAnimUpdate(const struct bContext *C);
|
2009-07-28 18:51:06 +00:00
|
|
|
void uiFileBrowseContextProperty(const struct bContext *C, struct PointerRNA *ptr, struct PropertyRNA **prop);
|
2009-10-01 23:32:57 +00:00
|
|
|
void uiIDContextProperty(struct bContext *C, struct PointerRNA *ptr, struct PropertyRNA **prop);
|
2009-04-03 23:30:32 +00:00
|
|
|
|
2009-04-09 18:11:18 +00:00
|
|
|
/* Styled text draw */
|
2009-04-10 14:06:24 +00:00
|
|
|
void uiStyleFontSet(struct uiFontStyle *fs);
|
2012-12-13 02:40:49 +00:00
|
|
|
void uiStyleFontDrawExt(struct uiFontStyle *fs, const struct rcti *rect, const char *str,
|
2013-12-02 20:33:45 +11:00
|
|
|
size_t len, float *r_xofs, float *r_yofs);
|
2013-12-07 15:47:57 +11:00
|
|
|
void uiStyleFontDraw(struct uiFontStyle *fs, const struct rcti *rect, const char *str);
|
|
|
|
void uiStyleFontDrawRotated(struct uiFontStyle *fs, const struct rcti *rect, const char *str);
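A minimal sketch of drawing a string with the style-aware helpers above; the
'widget' uiFontStyle member of uiStyle is an assumption about the struct
layout, used here purely for illustration:

static void draw_label_sketch(const struct rcti *rect, const char *str)
{
	uiStyle *style = UI_GetStyleDraw();          /* DPI scaled drawing settings */

	uiStyleFontSet(&style->widget);              /* bind the widget font/size   */
	uiStyleFontDraw(&style->widget, rect, str);  /* draw clipped to the rect    */
}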
|
2009-04-09 18:11:18 +00:00
|
|
|
|
2010-11-17 09:45:45 +00:00
|
|
|
int UI_GetStringWidth(const char *str); // XXX temp
|
|
|
|
void UI_DrawString(float x, float y, const char *str); // XXX temp
|
2011-02-13 14:16:36 +00:00
|
|
|
void UI_DrawTriIcon(float x, float y, char dir);
|
Holiday coding log :)
Nice formatted version (pictures soon):
http://wiki.blender.org/index.php/Dev:Ref/Release_Notes/2.66/Usability
Short list of main changes:
- Transparent region option (over main region), added code to blend in/out such panels.
- Min size window now 640 x 480
- Fixed DPI for the UI - lots of cleanup and changes everywhere. Icon images still need correct sizes, and the layer-in-use icon needs a remake.
- Macbook retina support, use command line --no-native-pixels to disable it
- Timeline Marker label was drawing wrong
- Trackpad and magic mouse: supports zoom (hold ctrl)
- Fix for splash position: removed ghost function and made window size update after creation immediate
- Fast undo buffer save now adds UI as well. Could be checked for regular file save even...
Quit.blend and temp file saving use this now.
- Fixed filename in window on reading quit.blend or temp saves, and they now add a warning in the window title: "(Recovered)"
- New Userpref option "Keep Session" - this always saves quit.blend, and loads on start.
This allows keeping UI and data without actual saves, until you actually save.
When you load startup.blend and quit, it recognises the quit.blend as a startup (no file name in header)
- Added 3D view copy/paste buffers (selected objects). Shortcuts ctrl-c, ctrl-v (OSX, cmd-c, cmd-v).
Coded partial file saving for it. Could be used for other purposes. Todo: use OS clipboards.
- User preferences (themes, keymaps, user settings) now can be saved as a separate file.
Old option is called "Save Startup File" the new one "Save User Settings".
To visualise this difference, the 'save startup file' button has been removed from user preferences window. That option is available as CTRL+U and in File menu still.
- OSX: fixed bug that stopped giving mouse events outside window.
This also fixes "Continuous Grab" for OSX. (error since 2009)
2012-12-12 18:58:11 +00:00
|
|
|
|
|
|
|
uiStyle *UI_GetStyle(void); /* use for fonts etc */
|
|
|
|
uiStyle *UI_GetStyleDraw(void); /* DPI scaled settings for drawing */
|
|
|
|
|
2011-02-13 14:16:36 +00:00
|
|
|
/* linker workaround ack! */
|
|
|
|
void UI_template_fix_linking(void);
|
2009-04-09 18:11:18 +00:00
|
|
|
|
2011-10-23 04:13:56 +00:00
|
|
|
/* UI_OT_editsource helpers */
|
2013-04-04 02:05:11 +00:00
|
|
|
bool UI_editsource_enable_check(void);
|
2011-10-23 04:13:56 +00:00
|
|
|
void UI_editsource_active_but_test(uiBut *but);
|
|
|
|
|
2014-02-08 09:03:25 +11:00
|
|
|
/* UI_butstore_ helpers */
|
|
|
|
typedef struct uiButStore uiButStore;
|
|
|
|
typedef struct uiButStoreElem uiButStoreElem;
|
|
|
|
|
|
|
|
uiButStore *UI_butstore_create(uiBlock *block);
|
|
|
|
void UI_butstore_clear(uiBlock *block);
|
|
|
|
void UI_butstore_update(uiBlock *block);
|
|
|
|
void UI_butstore_free(uiBlock *block, uiButStore *bs);
|
|
|
|
bool UI_butstore_is_valid(uiButStore *bs);
|
|
|
|
bool UI_butstore_is_registered(uiBlock *block, uiBut *but);
|
|
|
|
void UI_butstore_register(uiButStore *bs_handle, uiBut **but_p);
|
|
|
|
void UI_butstore_unregister(uiButStore *bs_handle, uiBut **but_p);
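A minimal sketch of keeping a uiBut pointer valid across a block refresh with
the UI_butstore_ helpers above; the surrounding function is a placeholder and
the refresh itself happens elsewhere:

static uiBut *track_button_sketch(uiBlock *block, uiBut *but)
{
	uiButStore *bs = UI_butstore_create(block);

	UI_butstore_register(bs, &but);   /* keep 'but' updated when the block is rebuilt */

	/* ... the block may be refreshed here; registered pointers are remapped ... */

	if (!UI_butstore_is_valid(bs)) {
		but = NULL;                   /* the tracked button no longer exists */
	}

	UI_butstore_unregister(bs, &but);
	UI_butstore_free(block, bs);
	return but;
}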
|
|
|
|
|
|
|
|
|
2014-01-08 17:04:10 +01:00
|
|
|
/* Float precision helpers */
|
|
|
|
#define UI_PRECISION_FLOAT_MAX 7
|
|
|
|
|
|
|
|
int uiFloatPrecisionCalc(int prec, double value);
|
|
|
|
|
2013-03-23 16:03:13 +00:00
|
|
|
#endif /* __UI_INTERFACE_H__ */
|