
Creating video content with AI tools

Related people and/or projects: Welcome James!

Explore a new AI video tool and consider its possible production implications. Join us as we, very hesitantly and with some trepidation, experiment with an AI video creation tool.

You can bet that the ETO is keeping an eye on all things AI. One particularly interesting tool that caught our attention is HeyGen, one of many emerging tools that generates videos after you upload a sample video for training and then supply a script. As a shop that makes a lot of videos, we're always curious about ways to decrease production time while improving the learner experience. A common tension is that the instructors and industry experts we work with don't have a lot of time.

We're going to go in depth on HeyGen, including a few examples, and share some ideas on how to get the most out of a tool like this while considering authenticity, privacy, and intellectual property. With this article, the ETO is not suggesting that instructors use AI video generators in their courses; rather, we want to raise awareness of new tool developments and consider how they might change things in the future.

First, watch this video:

Notice anything amiss? This video was generated entirely by HeyGen's AI. Media Specialist James uploaded a script to his AI avatar. For this experiment, he used HeyGen, an AI video tool that can create a digital twin of yourself (a "custom avatar"), which you can then use to generate talking-head videos by feeding it a script or a real recording of your own voice.

How does it work?

You can upload a 2-minute video of yourself talking, and HeyGen will generate an avatar and a voice clone of you. You can then use this to create videos without doing any filming.

With a subscription, you can fine-tune your avatar so it's more authentic and create a more accurate voice clone. With "Studio Avatars," HeyGen's much more involved process for creating a custom avatar, you can even swap outfits and backgrounds. Learn more about all the avatar tiers.
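
For readers who like to peek under the hood: HeyGen also exposes a REST API, so the upload-once, generate-many workflow can be scripted. The sketch below is illustrative only; it follows the general shape of HeyGen's v2 video-generation endpoint at the time of writing, the API_KEY, AVATAR_ID, and VOICE_ID values are placeholders, and exact field names may change, so check the current API docs before relying on it.

```python
import requests

API_KEY = "YOUR_HEYGEN_API_KEY"   # placeholder: your HeyGen API key
AVATAR_ID = "your_custom_avatar"  # placeholder: your custom avatar's ID
VOICE_ID = "your_voice_clone"     # placeholder: your cloned voice's ID

# Ask HeyGen to render a talking-head video of the avatar reading a script.
# Endpoint shape is based on HeyGen's v2 API; verify against current docs.
response = requests.post(
    "https://api.heygen.com/v2/video/generate",
    headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
    json={
        "video_inputs": [
            {
                "character": {"type": "avatar", "avatar_id": AVATAR_ID},
                "voice": {
                    "type": "text",
                    "input_text": "Welcome to Module 1! In this module...",
                    "voice_id": VOICE_ID,
                },
            }
        ],
        "dimension": {"width": 1280, "height": 720},
    },
    timeout=30,
)
response.raise_for_status()

# Rendering happens asynchronously; the API returns an ID to check on later.
video_id = response.json()["data"]["video_id"]
print(f"Render queued, video_id = {video_id}")
```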

Can it do other languages?

You be the judge:

For the above video, James translated the same script into Cantonese, recorded the voice-over himself, and uploaded the recording to HeyGen. Although Cantonese isn't officially supported by HeyGen right now, the AI avatar can still lip-sync to the recording. As a Cantonese speaker, James reports that his lip movements here look 100% authentic.
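
In API terms, the audio-driven workflow James used would swap the scripted text for an uploaded recording. This is a hedged sketch: we're assuming the v2 endpoint accepts a voice of type "audio" pointing at an already-hosted recording (AUDIO_URL is a placeholder), so treat it as an illustration rather than a recipe.

```python
import requests

API_KEY = "YOUR_HEYGEN_API_KEY"   # placeholder
AVATAR_ID = "your_custom_avatar"  # placeholder
AUDIO_URL = "https://example.com/cantonese-script.mp3"  # placeholder: hosted recording

# Drive the avatar's lip-sync from a real voice recording instead of a script.
# We assume the v2 endpoint accepts a voice of type "audio"; check the docs.
response = requests.post(
    "https://api.heygen.com/v2/video/generate",
    headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
    json={
        "video_inputs": [
            {
                "character": {"type": "avatar", "avatar_id": AVATAR_ID},
                "voice": {"type": "audio", "audio_url": AUDIO_URL},
            }
        ],
        "dimension": {"width": 1280, "height": 720},
    },
    timeout=30,
)
response.raise_for_status()
print("Render queued:", response.json()["data"]["video_id"])
```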

How could the ETO use a tool like this?

This is really new, and we have no plans at the moment to use this tool in a project. What we're doing here is simply experimenting with a new technology; there's a lot to consider before even piloting a tool like this.

Benefits for Instructors and Industry Experts

  1. Reducing discomfort from being on camera. We often work with subject matter experts (SMEs) who have very little experience being on camera. For those who find filming uncomfortable, a tool like this could be a huge benefit.
  2. Removing geographic barriers. Some SMEs cannot come to the studio to record on location and are limited to recording remotely. Depending on the SME's comfort level with AV equipment, we might get video files whose quality leaves a lot to be desired. Custom avatar tools can help avoid quality discrepancies across videos recorded by different SMEs.

Benefits for the ETO

  1. Increasing efficiency during video revisions. If an SME makes a critical mistake in a video and it isn't caught on the spot, fixing it can be a challenge depending on factors like project deadlines, locations, the SME's availability, and the ETO team's availability and workload. A custom avatar can make fixing a mistake like this much easier: you can simply change the script and regenerate the video, or, if a real voice recording was used, re-record only the mistaken lines, re-upload the recording, and regenerate the video (see the sketch after this list).
  2. Consistency across modules and courses. We try to design a consistent learning experience within a project, which results in elements that are reused across modules. An example would be a review video at the end of each module that articulates key points. If we decided after filming to add an element like this, we could use this type of tool to create it without bringing the SME back in for more filming.
  3. Use for training (and other lower-stakes) video content. There could be use cases where video would be helpful, but we don't have the timeline to produce it. A tool like this could add a bit of human (yes, we see the irony) authenticity to a video without the production load.
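
To make the revision workflow in point 1 concrete, here is a minimal sketch in Python. It builds on the same assumed endpoints as the earlier sketches, plus HeyGen's video-status endpoint; regenerate_video and wait_for_video are our own hypothetical helper names, and the IDs are placeholders, so verify the request and response shapes against the current documentation.

```python
import time
import requests

API_KEY = "YOUR_HEYGEN_API_KEY"   # placeholder
AVATAR_ID = "your_custom_avatar"  # placeholder
VOICE_ID = "your_voice_clone"     # placeholder

def regenerate_video(corrected_script: str) -> str:
    """Re-render the avatar video with a corrected script (hypothetical helper)."""
    resp = requests.post(
        "https://api.heygen.com/v2/video/generate",
        headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
        json={
            "video_inputs": [
                {
                    "character": {"type": "avatar", "avatar_id": AVATAR_ID},
                    "voice": {
                        "type": "text",
                        "input_text": corrected_script,
                        "voice_id": VOICE_ID,
                    },
                }
            ],
            "dimension": {"width": 1280, "height": 720},
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["video_id"]

def wait_for_video(video_id: str, poll_seconds: int = 15) -> str:
    """Poll until the render finishes, then return the download URL."""
    while True:
        status = requests.get(
            "https://api.heygen.com/v1/video_status.get",
            headers={"X-Api-Key": API_KEY},
            params={"video_id": video_id},
            timeout=30,
        ).json()["data"]
        if status["status"] == "completed":
            return status["video_url"]
        if status["status"] == "failed":
            raise RuntimeError(f"Render failed: {status}")
        time.sleep(poll_seconds)

# Fixing a flubbed line becomes: edit the script, re-run, download.
video_id = regenerate_video("Welcome to Module 3! The corrected key point is...")
print("Finished video at:", wait_for_video(video_id))
```

The appeal here is that a flubbed line becomes a script edit and a re-render rather than a reshoot.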

What are a few tips, if you're thinking about making your own avatar?

From James:

  • I noticed that the quality of the 2-minute video you upload is crucial to creating an avatar that works well with different scripts. I had to redo this step with different videos of myself before getting a really good result. The key is not to move around too much and to stick to generic gestures.
  • Note that for the AI avatar you see in the linked videos above, I just used a segment of a video I had filmed in the past (for unrelated purposes) rather than filming one specifically for this purpose. I suspect I would get even better results with a purpose-made recording.
  • If an SME doesn’t have a good camera and/or good lighting to begin with, it might negatively affect the quality of the custom avatar.
  • There is currently no way to fine-tune the behavior of your custom avatar on a video-by-video basis. For example, in some of the videos HeyGen generated for me (linked above), my gestures and facial expressions don't really match what I'm saying at that specific moment. I assume this will improve as the company continues to develop the tool.
  • If you are using your custom avatar with your own voice clone, your voice will take on a very standard American accent; any unique accent of your own will be gone. As far as I know, there’s no way to tweak or change the accent of your voice clone. However, I’m not sure if your voice clone will take on a standard British accent if you speak with a British accent in the 2-minute video you feed it at the beginning.

Can a tool like this be abused?

All tools should be considered carefully before use. To guard against abuse, HeyGen requires you to film a consent video before it will generate a custom avatar. This should prevent people from creating avatars of others without their consent, though we do have concerns about how strict the review process is. See HeyGen's full privacy information page.

 

Article Category: General Information
Tags: AI