
Giving voice to your characters brings them to life. However, doing it the traditional way can be tedious. Fortunately, the process of making your characters talk can now be automated, and it gives pretty good results.
Follow these 5 simple steps and you will instantly have a talking character.
This tutorial assumes you have already prepared a character with the right hierarchy and mouth shapes (visemes) and imported it into Adobe Character Animator.
Step 1: Record
For the character to talk, it needs to have something to say, so the first step is recording your audio. (Raw audio works best for lip-syncing.)
What You Will Need:
Microphone
Recording a clean voice track can be a complex process: it depends on the environment you record in and the equipment you use. Although a professional mic will make a great difference, you can start with what you have for now, even if that means using the camera mic. You can read all about the best practices for recording at home here.
Recording Software
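Any recording app that can export an uncompressed WAV will do. If you would rather capture the take from a script, here is a minimal Python sketch, assuming the third-party sounddevice and soundfile packages are installed (the duration and file name are placeholders, not anything the tutorial prescribes):

```python
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 44100   # samples per second (CD quality)
DURATION = 10         # length of the recording in seconds (placeholder)

# Capture mono audio from the default input device
take = sd.rec(int(DURATION * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
sd.wait()  # block until the recording finishes

# Save the take as an uncompressed WAV, ready to import into Character Animator
sf.write("narration.wav", take, SAMPLE_RATE)
```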
Alternative:
If you want to skip the recording step, you can use an online text-to-speech converter to generate an audio sample.
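Most of these converters run in the browser, but the same thing can be scripted. Here is a minimal sketch using the gTTS Python package (my own arbitrary choice, not something the tutorial requires); it calls Google's online text-to-speech service and saves an MP3 you can import in the next step:

```python
from gtts import gTTS

# The line you want your character to say (placeholder text)
line = "Hi there, and welcome to the show!"

# Generate speech with the online text-to-speech service and save it to disk
gTTS(text=line, lang="en").save("character_line.mp3")
```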
Step 2: Import the Audio File
Now it is time to import the audio file you created earlier into Character Animator. Go to File and then click Import (or use the Ctrl+L shortcut). Browse to the location where you saved your recording and select the audio file.
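Since raw audio works best for lip-syncing, you may want to convert a compressed file (for example, the MP3 from a text-to-speech tool) to an uncompressed WAV before importing it. A minimal sketch, assuming the pydub package and ffmpeg are installed (the file names are placeholders):

```python
from pydub import AudioSegment  # pydub relies on ffmpeg for MP3 decoding

# Load the compressed narration and re-export it as an uncompressed WAV
audio = AudioSegment.from_file("character_line.mp3")
audio.export("character_line.wav", format="wav")
```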

Step 3: Add the Audio File to Your Scene
Now that the audio file is in your project, just drag it from the Project panel into your scene.

Step 4: Unarm Everything Besides Lip Sync
4a. Select your character in the scene timeline and unarm* all the behaviors in the Properties panel. (*Unarm = remove the red dots from the behaviors.)
Clicking the red dot next to a behavior unarms it. To unarm all the behaviors at once, hold the Ctrl key (Cmd on Mac) and click one of the red dots.

4b. Arm the Lip Sync behavior in the Properties panel. (You should now have a red dot next to the Lip Sync behavior.)
Step 5: Compute the Lip Sync

Go to Timeline in the top menu and click Compute Lip Sync Take from Scene Audio. Wait a few seconds and you are done: your character should be talking once you click play.
Now What?
Now that you have your character lip-synced, you can record other behaviors (like face movement, draggable hands, and eye gaze) to make your character more expressive. We will have that covered in the next tutorial. Stay tuned!
How can you apply the same computed lip sync to several different puppets? I have two puppets with the same mouths and just want them to use the same lip sync data that I corrected after letting CH compute it.
Hi, Bob!
Applying the same lip sync to several puppets in Adobe Character Animator is actually very simple: you just need to copy and paste it.
1. Go to the scene timeline.
2. Select the puppet you already created the lip sync for.
3. Click the little triangle next to it to show all the recorded takes.
4. Select the lip sync take and all the visemes.
5. Click Edit > Copy.
6. Select the new puppet in the timeline (you can select several puppets at once).
7. Move the playhead to the same time the original puppet's lip sync starts from.
8. Click Edit > Paste.
That's it: your new puppet will now have the lip sync.
Thanks!