From the course: TouchDesigner & Unreal: Interactive Controllers

Welcome

- [Instructor] In this course, we're going to learn how to use interactive hardware controllers to drive real-time 3D scenes in both TouchDesigner and Unreal Engine. We'll start off in TouchDesigner, where we'll use a MIDI controller with sliders to drive an effects system that processes the output of a real-time rendering system. Then we'll move on to OSC, using TouchOSC to build a custom UI that we transfer to an iPad, which sends OSC messages that TouchDesigner can receive. We'll learn how to use those messages to move particles and objects around on screen. Then we'll wrap up the TouchDesigner section by looking at how to get data off a Kinect sensor into TouchDesigner to control that same particle and object movement. From there, we'll switch over to Unreal Engine. We'll start with MIDI and look at how to build a Blueprint in Unreal Engine to get MIDI flowing into our system. Then we'll build a post-process material, controlled by that MIDI data, to change the stylization of our render in real time. Next, we'll move on to OSC and use an iPad running TouchOSC to send OSC data into Unreal, and we'll learn how to control objects in Unreal, in real time, from this wireless hardware controller. Finally, we'll get the data off a Kinect sensor into Unreal and learn how to control object and particle movement there from the Kinect data flowing in as we move around a room. These are two fantastic real-time 3D graphics systems with different strengths for different projects, and we're going to learn how to use real-time controllers to drive some great results in both of them.
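For a sense of what's flowing between the iPad and TouchDesigner in the OSC sections, an OSC message is just a small binary packet: a null-padded address string, a type-tag string, then the argument bytes. A minimal sketch in Python, using only the standard library (the `/1/fader1` address and port 10000 here are hypothetical examples, not values fixed by the course; real projects would typically use an OSC library and whatever port TouchDesigner's OSC In is set to listen on):

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded out to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    # address, then the type-tag string ",f" (one float), then a big-endian float32
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# e.g. a TouchOSC-style fader at a hypothetical address, set to 0.75
packet = osc_message("/1/fader1", 0.75)

# To actually send it, you'd fire the packet at TouchDesigner's OSC In port over UDP:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 10000))
```

In practice TouchOSC handles this encoding for you; the sketch just shows why the addresses you assign in the TouchOSC editor are what show up as channel names on the receiving end.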
