bge.logic.setRender(flag) to enable/disable render.
The render pass is enabled by default but it can be disabled with
bge.logic.setRender(False).
Once disabled, the render pass is skipped and a new logic frame starts
immediately. Note that VSync no longer limits the fps when render is off
but the 'Use Frame Rate' option in the Render Properties still does.
To run as many frames as possible, untick that option.
This function is useful when you don't need the default render, e.g.
when doing an offscreen render to a device other than the monitor.
Note that without VSync, you must limit the frame rate by other means.
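For illustration, a minimal sketch (assuming this runs from a Python controller in the scene) that turns the default render off:
import bge

def disable_render(cont):
    # Skip the default render pass; logic frames now run back to back,
    # so cap the frame rate with 'Use Frame Rate' or by other means.
    bge.logic.setRender(False)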
fbo = bge.render.offScreenCreate(width, height[, samples=0][, target=bge.render.RAS_OFS_RENDER_BUFFER])
Use this method to create an offscreen buffer of the given size, with the given MSAA
samples, targeting either a render buffer (bge.render.RAS_OFS_RENDER_BUFFER)
or a texture (bge.render.RAS_OFS_RENDER_TEXTURE). Use the former if you want to
retrieve the frame buffer on the host and the latter if you want to pass the render
to another context (textures are proper OGL objects, render buffers aren't).
The object created by this function can only be used as a parameter of the
bge.texture.ImageRender() constructor to send the render to the FBO rather
than to the frame buffer. This is best suited when you want to create a render
of specific size, or if you need an image with an alpha channel.
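For example, a sketch of the intended workflow (the camera name 'OffscreenCam' is a placeholder):
import bge

scene = bge.logic.getCurrentScene()
cam = scene.objects['OffscreenCam']

# 512x512 offscreen buffer backed by a render buffer, no MSAA
fbo = bge.render.offScreenCreate(512, 512, 0, bge.render.RAS_OFS_RENDER_BUFFER)

# Send the render of the scene seen from 'cam' to the FBO
# instead of the frame buffer.
ir = bge.texture.ImageRender(scene, cam, fbo)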
bge.texture.<imagetype>.refresh(buffer=None, format="RGBA", ts=-1.0)
Without arguments, the refresh method of the image objects is pretty much a no-op: it
simply invalidates the image so that on the next texture refresh, the image will
be recalculated.
It is now possible to pass an optional buffer object to transfer the image (and
recalculate it if it was invalid) to an external object. The object must implement
the 'buffer protocol'. The image will be transferred as "RGBA" or "BGRA" pixels
depending on the format argument (only those 2 formats are supported), and ts is an
optional timestamp if the image depends on one (e.g. VideoFFmpeg playing a video file).
With this function you no longer need to link the image object to a Texture
object to use it: the image object is self-sufficient.
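A minimal sketch, assuming an ImageViewport source and a bytearray as the external buffer (any object implementing the buffer protocol would do):
import bge

img = bge.texture.ImageViewport()
# 4 bytes per pixel (RGBA or BGRA)
buf = bytearray(img.size[0] * img.size[1] * 4)

# Recalculate the image if it was invalid and copy it into buf as BGRA pixels.
img.refresh(buf, "BGRA")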
bge.texture.ImageRender(scene, camera, fbo=None)
Render to buffer is possible by passing an FBO object (see offScreenCreate).
bge.texture.ImageRender.render()
Allows asynchronous render: call this method to render the scene without
extracting the pixels yet. The function returns as soon as the render commands
have been sent to the GPU. The render proceeds asynchronously on the GPU
while the host can perform other tasks.
To complete the render, either call refresh() directly or refresh the texture
to which this object is the source. Asynchronous render is useful to achieve optimal
performance: call render() on frame N and refresh() on frame N+1 to give the GPU
as much time as possible to render the frame while the game engine performs other tasks.
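A sketch of that pattern, reusing the 'ir' object and 'buf' buffer from the sketches above and toggling between the two steps on alternate logic frames:
def step(cont):
    own = cont.owner
    if own.get('render_queued', False):
        # frame N+1: the GPU had a full frame to finish, now fetch the pixels
        ir.refresh(buf, "RGBA")
        own['render_queued'] = False
    else:
        # frame N: issue the render commands only, return immediately
        ir.render()
        own['render_queued'] = True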
Support negative scale on camera.
Camera scale was previously ignored in the BGE.
It is now injected in the modelview matrix as a vertical or horizontal flip
of the scene (respectively if scaleY<0 and scaleX<0).
Note that the actual value of the scale is not used, only the sign.
This allows flipping the image produced by ImageRender() without any performance
degradation: the flip is integrated in the render itself.
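For example, to get a vertically flipped ImageRender output at no extra cost (reusing the 'cam' object from the earlier sketch):
# Only the sign matters: a negative Y scale flips the render vertically.
cam.localScale = [1.0, -1.0, 1.0]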
Optimized image transfer from ImageRender to buffer.
Previously, images that were transferred to the host were always going through
buffers in VideoTexture. It is now possible to transfer ImageRender
images to an external buffer without an intermediate copy (i.e. directly from OGL to the buffer)
if the attributes of the ImageRender object are set as follows:
flip=False, alpha=True, scale=False, depth=False, zbuff=False.
(If you need to flip the image, use a negative camera scale instead.)
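A sketch of the settings that enable the direct OGL-to-buffer path, again reusing 'ir' and 'buf' from the sketches above:
ir.flip = False    # flipping is done via the camera scale instead
ir.alpha = True
ir.scale = False
ir.depth = False
ir.zbuff = False
ir.refresh(buf, "RGBA")   # pixels are copied straight from OGL into buf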
ImageFFmpeg objects will not refresh properly because the image
file is closed immediately after creation. Therefore refresh()
should have no effect on them.
This was causing problems with ImageMix using ImageFFmpeg as
sources: refreshing the ImageMix object is required to update
the mix but it has the side effect of refreshing the underlying
sources, hence the need to skip refresh on fixed images.
In collaboration with Benoit Bolsee (mainly done under his
direction).
Note: FFmpeg lib needs to be compiled with rtsp support for this to
work.
Bug 1/2 of T41004
This was a regression since the avg_frame_rate changes.
Didn't find a reliable way to get the stream duration that works
with both FFmpeg and Libav, so added some freaking black
magic to distinguish one from the other.
Patch makes it possible to compile Blender with recent ffmpeg
and libav libraries, mainly by getting rid of deprecated API.
Original patch by Campbell Barton, with my own modifications to
support compilation with older ffmpeg versions.
This patch could break compatibility of FFV1 videos playing
back in older players, mainly because of alpha support changes.
Preserving compatibility with such players became a headache,
and I think it's high time to get rid of the workarounds here.
- PyLong_FromSsize_t --> PyLong_FromLong
- PyLong_AsSsize_t --> PyLong_AsLong
In all places except for those where the Python API expects Py_ssize_t (index lookups mainly).
- use PyBool_FromLong in a few areas of the BGE.
- fix incorrect assumption in the BGE that PySequence_Check() means PySequence_Fast_ functions can be used.
- Make FFmpeg initialization happen in the creator, not in the functions
which require FFmpeg. Makes it easier to follow when initialization
should happen.
- Enable DNxHD codec. It was commented out a while ago due to some strange
behavior on some platforms. Re-tested it on Linux and Windows and
it seemed to be working quite nicely. Let it be tested further;
if it turns out not to be stable enough, it's easy to comment it out again.
- Make non-error messages from writeffmpeg.c printed only if ffmpeg
debug argument was passed to blender. Reduces console pollution
with messages which are not useful for general troubleshooting.
Error messages would still be printed to the console.
- Show the FFmpeg error message when a video stream fails to allocate.
Makes it easier to understand what exactly is wrong from the Blender
interface, with no need to restart Blender with the FFmpeg debug flag and
check the console messages.
Used a custom log callback for this which stores the last error message
in a static variable. This is not thread safe, but with the current
design FFmpeg routines cannot be called from several threads
anyway, so I think it's a fine solution.
This switches some areas of Blender related to FFmpeg
from deprecated symbols to the currently supported ones.
Pretty straightforward changes based on FFmpeg's API
documentation about which symbols should now be used.
This should make Blender compatible with the recent FFmpeg 0.11.
There should be no functional changes.
from Jason Wilkins (jwilkins)
only applied some parts:
* const correctness
* moved a variable into a more local scope so it is also inside a #if/endif and does not end up conditionally unused
- spelling - turns out we had tessellation spelt wrong all over.
- use \directive for doxy (not @directive)
- remove BLI_sparsemap.h - was from bmesh merge IIRC but entire file commented and not used.
Revert of "SVN commit: /data/svn/bf-blender [36957]
trunk/blender/source/gameengine/ VideoTexture/VideoFFmpeg.cpp: fix for
ffmpeg linking in BGE ( patch by Jens Verwiebe (jensverwiebe) over IRC)"
Sorry folks, that patch breaks current ffmpeg GIT version.
Good news: it's all handled now automagically by ffmpeg_compat.h in
intern/ffmpeg
so: everything should be fine and dandy for very old and very new versions.
* removed a lot of old cruft code for ancient ffmpeg versions
* made it compile again against latest ffmpeg / libav GIT
(also shouldn't break distro ffmpegs, since those API changes
have been introduced over a year ago. If it nevertheless breaks,
please send me an email)
- Use BGL buffer instead of string for image data.
- Add buffer interface to image source.
- Allow customization of pixel format.
- Add valid property to check if the image data is available.
The image property of all Image source objects will now
return a BGL 'buffer' object. Previously it was returning
a string, which was not working at all with Python 3.1.
The BGL buffer type allows sequence access to bytes and
is directly usable in BGL OpenGL wrapper functions.
The buffer is formatted as a one-dimensional array of bytes
with 4 bytes per pixel in RGBA order.
BGL buffers will also be accepted in the ImageBuff load()
and plot() functions.
It is possible to customize the pixel format by using
the VideoTexture.imageToArray(image, mode) function:
the first argument is an Image source object, the second
optional argument is a format string using the R, G, B,
A, 0 and 1 characters. For example "BGR" means that each
pixel will be 3 bytes, corresponding to the Blue, Green
and Red channels in that order. Use 0 for a fixed hex 00
value, 1 for hex FF. The default mode is "RGBA".
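For example, a sketch using the 2.49-style module name of the surrounding text ('img' could be any Image source object):
import VideoTexture

img = VideoTexture.ImageViewport()
# 3 bytes per pixel: Blue, Green, Red
bgr = VideoTexture.imageToArray(img, "BGR")
# 4 bytes per pixel: Red, Green, Blue, alpha forced to hex FF
rgb1 = VideoTexture.imageToArray(img, "RGB1")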
All Image source objects now support the buffer interface,
which allows creating memoryview objects for direct access
to the image's internal buffer without a memory copy. The buffer
format is a one-dimensional array of bytes with 4 bytes per
pixel in RGBA order. The buffer is writable, which allows
custom modifications of the image data.
v = memoryview(source)
A bug in the Python 3.1 buffer API will cause a crash if
the memoryview object cannot be created. Therefore, you
must always check first that image data is available
before creating a memoryview object. Use the new valid
attribute for that:
if source.valid:
    v = memoryview(source)
    ...
Note: the BGL buffer object itself does not yet support
the buffer interface.
Note: the valid attribute makes sense only if you use the
image source in conjunction with a Texture object, like this:
# refresh texture but keep image data in memory
texture.refresh(False)
if texture.source.valid:
    v = memoryview(texture.source)
    # process image
    ...
# invalidate image for next texture refresh
texture.source.refresh()
Limitation: While memoryview objects exist, the image cannot be
resized. Resizing occurs with ImageViewport objects when the
viewport size is changed or with ImageFFmpeg when a new image
is reloaded, for example. Any attempt to resize will cause a
runtime error. Delete the memoryview objects if you want to
resize an image source object.
Add an optional parameter to the VideoTexture.Texture refresh() method
to specify the timestamp (in seconds from the start of the movie) of the frame
to be loaded. This value is passed down to the image source and, for a
VideoFFmpeg source, it is used instead of the current time to load
the frame from the video file.
When combined with an audio actuator, it can be used to synchronize
the sound and the image: specify the same video file in the sound
actuator and use the KX_SoundActuator time attribute as the timestamp
for refresh: the frame corresponding to the sound will be loaded:
GameLogic.video.refresh(True, soundAct.time)
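A sketch of that setup, assuming the Texture object was stored as GameLogic.video at startup and the controller owns a sound actuator named 'play_sound' (both names are placeholders):
import GameLogic

cont = GameLogic.getCurrentController()
soundAct = cont.actuators['play_sound']   # KX_SoundActuator playing the same file
# load the video frame that matches the current sound position
GameLogic.video.refresh(True, soundAct.time)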
svn merge https://svn.blender.org/svnroot/bf-blender/trunk/blender -r19820:HEAD
Notes:
* Game and sequencer RNA, and the sequencer header, are now a bit out of
date after changes in trunk.
* I didn't know how to port these bugfixes, most likely they are
not needed anymore.
* Fix "duplicate strip" always increase the user count for ipo.
* IPO pinning on sequencer strips was lost during Undo.
The multi-thread cache service is activated only on multi-core processors.
It consists of loading, decoding and caching the video frames in a
separate thread. The cache size is 5 decoded frames and 30 raw frames.
Note that opening a video file/stream/camera is not multi-threaded:
you will still experience a delay at VideoFFmpeg object creation.
Processing of the video frame (resize, loading to texture) is still done
in the main thread. Caching is automatically enabled for video file,
video streaming and video camera.
Video streaming now works correctly: the video frames are loaded
at the correct rate. Network delays and frequency drifts are automatically
compensated for.
Note: an http video source is always treated as a streaming source,
even though the http protocol allows seeking. For the user this means that
a start/stop range cannot be defined and the video cannot be restarted except
by reopening the source. Pause/play is however possible.
Video camera is now correctly handled on Linux: it will not slow down the BGE.
A video camera is treated as a streaming source.
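A sketch of opening a streaming source (the URL is a placeholder, and the usual texture binding is omitted); only play/pause apply, since ranges and restart are not available for streams:
import VideoTexture

# an http source is always treated as streaming; caching is enabled automatically
video = VideoTexture.VideoFFmpeg("http://example.com/live_stream.avi")
video.play()
# ... later ...
video.pause()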