BIFS Tutorial - Part IV: Interactivity
Now that we can declare and modify a complex scene, we need user interaction to create highly attractive content. Interactivity authoring mainly relies on the notions of identifiers, events and routes seen in Part II. The only addition is the use of special nodes which generate events when activated, and of nodes used for conditional execution and computational operations.
Most nodes generate events upon reception of events emitted by other nodes (Interpolators, Background2D, all nodes with fields of type exposedField). Some nodes, however, generate events without receiving events from other nodes, as we have seen with the TimeSensor node. Here is a list of common 2D nodes generating events:
BIFS Node         | Description
TouchSensor       | Detects mouse moves and button actions and generates position events, time events and state events (over, clicked).
DiscSensor        | Detects mouse moves and translates them into rotation events. The rotation center is the origin of the local coordinate system.
PlaneSensor2D     | Detects mouse moves and translates them into relative translation events.
InputSensor       | Detects predefined events coming from outside the scene and executes a list of BIFS commands upon reception. The events can be sent by a keyboard, a mouse, a joystick, etc.
ProximitySensor2D | Detects mouse presence in a virtual rectangle and generates position events, time events and state events (is inside).
Note 1: All position and rotation events generated are expressed in the local coordinate system of the shape the sensor is attached to.
Note 2: Depending on the scene tree structure, some interaction events may be ignored; more information on this is given in the VRML standard.
The TouchSensor node being the most commonly used interaction node, let's describe it in more detail. The TouchSensor node captures mouse events when the mouse is over a shape that has the same parent as the sensor:
Example:
<Transform2D>
  <children>
    <Transform2D>
      <children>
        <TouchSensor ... />
        <Shape DEF="S1">...</Shape>
      </children>
    </Transform2D>
    <Shape DEF="S2">...</Shape>
  </children>
</Transform2D>
In this example, the TouchSensor will be activated when the mouse is over S1, but not when the mouse is over S2.
BIFS Field       | Type    | Event type   | Description
enabled          | SFBool  | exposedField | Indicates whether the node is enabled or not.
hitPoint_changed | SFVec3f | eventOut     | Indicates the current mouse position in the local coordinate system. In 2D, the Z coordinate is set to 0.
isActive         | SFBool  | eventOut     | Indicates if the mouse button is pressed or not.
isOver           | SFBool  | eventOut     | Indicates if the mouse is over the associated shape(s) or not.
A short routing sketch using these event outputs is shown below.
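As a minimal sketch of how these fields are typically consumed (the DEF names TOUCH1, S3 and TIMER1 and the TimeSensor settings are illustrative assumptions, not taken from the tutorial files), the isOver output can for instance drive the enabled field of a TimeSensor so that an animation clock only runs while the mouse hovers over the shape:
<Transform2D>
  <children>
    <TouchSensor DEF="TOUCH1"/>
    <Shape DEF="S3">...</Shape>
  </children>
</Transform2D>
<TimeSensor DEF="TIMER1" cycleInterval="2" loop="true" enabled="false"/>
<!-- The animation clock runs only while the mouse is over S3 -->
<ROUTE fromNode="TOUCH1" fromField="isOver" toNode="TIMER1" toField="enabled"/>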
The Conditional node is used for conditional execution: its buffer field stores a list of BIFS commands which are executed when the node is activated (when its activate event receives TRUE, or its reverseActivate event receives FALSE):
<Conditional>
  <buffer> ... list of BIFS commands ... </buffer>
</Conditional>
Example: The following scene shows a Conditional node combined with a TouchSensor. The Conditional adds an object when the rectangle is clicked: cond1.xmt, cond1.bt, cond1.mp4. A simplified sketch of the same pattern is given below.
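For illustration only (this is not the actual cond1.xmt, and it replaces a field instead of inserting a node), here is a hedged sketch of a TouchSensor driving a Conditional; the DEF names TR1, TS1 and C1, the rectangle parameters and the exact XMT-A syntax of the Replace command inside the buffer are assumptions:
<Transform2D DEF="TR1">
  <children>
    <TouchSensor DEF="TS1"/>
    <Shape>
      <appearance>
        <Appearance>
          <material><Material2D emissiveColor="1 0 0" filled="true"/></material>
        </Appearance>
      </appearance>
      <geometry><Rectangle size="100 50"/></geometry>
    </Shape>
  </children>
</Transform2D>
<!-- The buffer holds the BIFS commands executed when the Conditional is activated -->
<Conditional DEF="C1">
  <buffer>
    <Replace atNode="TR1" atField="translation" value="50 0"/>
  </buffer>
</Conditional>
<!-- Pressing the mouse over the rectangle sends TRUE to activate, which runs the buffer -->
<ROUTE fromNode="TS1" fromField="isActive" toNode="C1" toField="activate"/>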
The Valuator node can be used to perform a simple linear transformation on events as well as type casting (conversion from one field type to another). The node has an input field and an output field for every BIFS base type (SFBool, SFInt32, MFColor, ...) except SFNode and MFNode. These fields are named inSFBool/outSFBool, inSFInt32/outSFInt32, inMFColor/outMFColor, and so on. It also has eight fields of type SFFloat called Factor1, Factor2, Factor3, Factor4 and Offset1, Offset2, Offset3, Offset4, plus an SFBool field called Sum.
The input values are first converted to floating point values. Then, for each eventOut field, the following formula is applied to every component of the output value:
output_i = Factor_i * input_i + Offset_i
where:
input_i: i-th component of the input event.
output_i: i-th component of the output event.
Factor_i: the Factor1, Factor2, Factor3 or Factor4 field, depending on the value of i.
Offset_i: the Offset1, Offset2, Offset3 or Offset4 field, depending on the value of i.
In the special case of a scalar input type (e.g. SFBool, SFInt32) cast to a vectorial output type (e.g. SFVec2f), every component input_i takes the value of the scalar input, after type conversion. Otherwise, if the output field has more components than the input field (e.g. SFVec2f to SFVec3f), the extra output components are set to 0. If the Sum field is TRUE, the sum of all input components is used instead of matching components one to one.
The output values are then cast back to the type of the corresponding output field.
Example: The following scene shows a Valuator combined with a TouchSensor to change an object's color when clicking on a rectangle: val1.xmt, val1.bt, val1.mp4. A hedged sketch of this pattern follows.
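The sketch below is illustrative only (not the actual val1.xmt); the DEF names TS2, MAT1 and VAL1, the colors, and the assumption that TRUE is converted to 1.0 and FALSE to 0.0 are mine. Clicking the rectangle sends isActive = TRUE to inSFBool; with the Factor and Offset values below, output_1 = -1*1 + 1 = 0, output_2 = 1*1 + 0 = 1 and output_3 = 0, so outSFColor becomes green (0 1 0) while the button is pressed and returns to red (1 0 0) on release:
<Transform2D>
  <children>
    <TouchSensor DEF="TS2"/>
    <Shape>
      <appearance>
        <Appearance>
          <material><Material2D DEF="MAT1" emissiveColor="1 0 0" filled="true"/></material>
        </Appearance>
      </appearance>
      <geometry><Rectangle size="100 50"/></geometry>
    </Shape>
  </children>
</Transform2D>
<!-- Linear mapping: output_i = Factor_i * input_i + Offset_i -->
<Valuator DEF="VAL1" Factor1="-1" Factor2="1" Factor3="0" Offset1="1" Offset2="0" Offset3="0"/>
<ROUTE fromNode="TS2" fromField="isActive" toNode="VAL1" toField="inSFBool"/>
<ROUTE fromNode="VAL1" fromField="outSFColor" toNode="MAT1" toField="emissiveColor"/>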
The Script node is used to interface ECMAScript code with the BIFS scene. ECMAScript is the root standard of JavaScript, VRMLScript and others. The Script node design is unique in BIFS: it has a variable number of fields (from 3 to as many as needed) and can thus be used to process any kind of event for complex interactivity. The pre-defined url field is used to store the script code.
For each eventIn (my_event) defined by the user in the Script node, an ECMAScript function (function my_event(value, timestamp)) is defined in the script, where value is the received value and timestamp is the current scene time.
Moreover, the initialize function, when present, is called upon creation of the node, and the shutdown function, when present, is called at deletion of the node. This allows for very complex authoring, as found in most JavaScript games on the internet.
Example: The following scene shows how to modify a node field from inside a Script node: script1.xmt, script1.bt, script1.mp4. A rough sketch of such a script is given below.
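For illustration only (this is not the actual script1.xmt): the sketch below moves a circle each time it is clicked. The DEF names, the geometry, and in particular the exact XMT-A syntax for declaring Script fields and for embedding the script text in the url field are assumptions and may differ from the real files; the ECMAScript function names simply follow the convention described above:
<Transform2D DEF="TR2" translation="0 0">
  <children>
    <TouchSensor DEF="TS3"/>
    <Shape>
      <appearance>
        <Appearance>
          <material><Material2D emissiveColor="0 0 1" filled="true"/></material>
        </Appearance>
      </appearance>
      <geometry><Circle radius="30"/></geometry>
    </Shape>
  </children>
</Transform2D>
<Script DEF="SCR1">
  <!-- user-defined eventIn: the ECMAScript function of the same name is called when an event arrives -->
  <field name="clicked" type="SFBool" vrml97Hint="eventIn"/>
  <!-- SFNode field giving the script access to the Transform2D it modifies -->
  <field name="target" type="SFNode" vrml97Hint="field">
    <Transform2D USE="TR2"/>
  </field>
  <url>
    <script>
      javascript:
      function initialize() {
        // called once when the Script node is created
      }
      function clicked(value, timestamp) {
        // on mouse press, shift the circle 10 units to the right
        if (value) {
          target.translation = new SFVec2f(target.translation.x + 10, target.translation.y);
        }
      }
    </script>
  </url>
</Script>
<ROUTE fromNode="TS3" fromField="isActive" toNode="SCR1" toField="clicked"/>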
Manipulation of the scene graph from ECMAScript is defined in the VRML specification.
Exercise 14: Rewrite the val1.xmt example to use a Conditional instead of a Valuator while keeping the same behaviour.
In this part we have reviewed the most common interaction tools of BIFS. Although far from exhaustive, this overview gives you enough tools to create very complex and impressive content, from games to presentations and interactive movies.