This is a written, more detailed version of the Quick Camera Tracking tutorial in Blender,
for those who found the video too fast. 😄😄😄


This is how I tracked the camera for the "When you hum the Game of Thrones Theme" AVAV.


If you don't have a camera or just want to practice camera tracking without the hassle of shooting,
you can download the footage I used on the AVAV here:

But if you want to shoot your own footage, these are the things you need to take note of:

Camera movement
Keep it minimal. That means no shakes, no major changes in camera angle,
no fast movement, and no zooming in or out.

List down your camera settings:
Focal Length
Sensor Size
Frame Rate


Open up Blender and change the layout to Motion Tracking.

Open the footage you want camera tracked.

Change the frame rate in Blender
equal to the frame rate of the raw footage.
(29.25 if you used the raw footage provided above)

This is important because if the frame rate isn't set correctly,
the track may look fine while you're tracking,
but it will fall apart after solving.
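If you prefer doing this from Blender's Python console, note that Blender stores the scene rate as an integer `fps` over a float `fps_base`, so a fractional rate like 29.25 has to be split into a pair. A small sketch (the `bpy` lines are commented out because they only run inside Blender):

```python
from fractions import Fraction

# Blender stores the scene frame rate as an integer fps over a float
# fps_base, so a fractional rate like 29.25 becomes 117 / 4.
rate = Fraction("29.25").limit_denominator(1001)
fps, fps_base = rate.numerator, float(rate.denominator)
print(fps, fps_base)  # 117 4.0

# Inside Blender, you would then set:
# import bpy
# bpy.context.scene.render.fps = fps
# bpy.context.scene.render.fps_base = fps_base
```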

Pick a frame from the raw footage where the CGI element will mostly appear.

(In the footage provided, pick the last frame)

This is just to ensure that the track is more accurate around the frames that matter. :)

After picking a frame, press "Detect Features" on the Track panel.

This will automatically plot out track points on the frame.

To increase the number of track points, go to the Detect Features options below the Track panel

and lower the Threshold and Distance values.

(In the footage above, I changed the Threshold to 0.150 and the Distance to 70)
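For reference, the same step can be scripted with Blender's Python API. This is only a sketch: clip operators like `detect_features` need a Movie Clip Editor context, so run it from that editor's Python console (or with a context override).

```python
import bpy

# Detect Features with a lower threshold and distance, as in the UI above.
bpy.ops.clip.detect_features(
    placement='FRAME',   # allow markers anywhere in the frame
    threshold=0.15,      # lower threshold -> more (weaker) features accepted
    min_distance=70,     # lower distance -> features allowed closer together
)
```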

If you're satisfied with the number of track points, go back to the Track panel and start the track.

(Since we picked the last frame of the footage above, we track backwards.

Track forwards if you picked the first frame,

or, if you picked a frame in the middle of the footage, track forwards and backwards from that frame.)
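Scripted, the tracking direction is just a flag on the `track_markers` operator (again a `bpy` sketch that needs a Movie Clip Editor context):

```python
import bpy

# We start on the last frame, so track backwards through the whole clip.
bpy.ops.clip.track_markers(backwards=True, sequence=True)

# If you started on the first frame, use backwards=False;
# for a middle frame, run the operator once in each direction.
```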

After the tracking is done, expand the Graph editor to clearly see the curve representing each track.

Examine the graph and look for a curve that deviates sharply from the rest,

then select it and delete it by pressing "X" and choosing "Delete Curve".

Examine the graph for more stray curves and delete them.

These curves are easy to spot because they stand out from the rest.

They are tracks that drifted far away from the point they were supposed to be tracking.

That's why they need to be deleted: to avoid feeding Blender false information about the camera motion.
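Blender also has an automated version of this cleanup: the Clean Tracks operator. Its reprojection-error filter only means something after you've solved the camera (a later step), but the minimum-length filter works right away. The values below are assumptions to tune per shot; a `bpy` sketch:

```python
import bpy

# Remove obviously bad tracks automatically.
bpy.ops.clip.clean_tracks(
    frames=10,              # drop tracks shorter than 10 frames
    error=2.0,              # drop tracks whose reprojection error exceeds 2.0
    action='DELETE_TRACK',  # delete the whole track, not just bad segments
)
```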

After cleaning up the track, head over to the panel at the right side of the footage

and look for the camera and lens settings.

This is the part where we'll need the Sensor Size and the Focal Length of your camera.

If you don't know them, you can use the camera presets provided by Blender (see image above)

or try looking up your camera's specifications on Google.

I can't really tell you how else to find them, unless you're using your phone as your camera (which I did):

To determine the Focal Length and Sensor Size of my phone camera, I used an app called Phone Tester.

There are a lot of apps like this one that tell you all the specs of your phone.

Just look for one and download it for free from your app store.

Set the focal length and sensor size equal to the camera's.

You'll only need the width for the sensor size.

(For the footage above, the Focal Length is 2.94 mm and the Sensor Width is 3.6 mm)
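These two values live on the clip's tracking camera in `bpy`, so they can also be set from a script (a sketch; it assumes your footage is the first loaded movie clip):

```python
import bpy

clip = bpy.data.movieclips[0]   # assumes the footage is the first clip
camera = clip.tracking.camera
camera.focal_length = 2.94      # millimetres
camera.sensor_width = 3.6       # millimetres
```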

After setting up the camera settings, go to the Solve panel and press "Solve Camera Motion".

This process is done in a few moments.

It will give you a Solve Error information afterwards.

For best results, the solve error should be less than 0.2.

If the Solve Error is higher than 0.2,

the best workaround I've found is to play with the settings on the Solve panel. In short: trial and error.

On the Solve panel, set Keyframes A & B to two frames that have some significant camera movement between them.

For example, in the footage used above, between frames 70 and 140,

the camera tilted from a higher angle to a lower angle.

You can also tell Blender to refine some camera settings.

(For the footage used, refining K1 and K2, the lens's radial distortion coefficients, did the trick)
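This trial-and-error loop can also be scripted. A sketch, again assuming the footage is the first loaded clip; note that the property controlling K1/K2 refinement has changed names across Blender versions, so check yours:

```python
import bpy

clip = bpy.data.movieclips[0]
tracking = clip.tracking

# Keyframes A & B: two frames with clearly different camera angles.
tracking.objects.active.keyframe_a = 70
tracking.objects.active.keyframe_b = 140

# Ask the solver to also refine radial distortion (K1/K2).
# (Property name varies by Blender version; this is the 3.x spelling.)
tracking.settings.refine_intrinsics_radial_distortion = True

bpy.ops.clip.solve_camera()
print(tracking.reconstruction.average_error)  # aim for below 0.2
```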

Don't forget to press the Solve Camera Motion button each time you make changes.

You will see that after making some changes on the Solve panel settings, the solve error will drop from high to low.

Remember to make sure that the solve error is lower than 0.2.

If the solve error is lower than 0.2, you can now set the Orientation:

just select THREE track points that lie on the floor in the footage, go back to the Solve panel,

and under the Orientation options, press "Floor".

To set the scale, select TWO track points, preferably ones that are close to each other,

and press "Set Scale".

Select ONE track point and press "Set Origin" to make it the center of the 3D view.

And you can also set the X-axis, if you want to.

You can preview the Orientation in the 3D view at the upper corner of the window.

If you're satisfied with the Orientation, just go back to the Solve Panel,

head over the Scene Setup option and press "Setup Tracking Scene".

Blender will then automatically set things up in the 3D Viewport

(set the raw footage as the background image, add the track points as empties, and apply the camera movement)

and will also automatically create a compositing node setup.
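That whole button maps to a single operator, if you're scripting the pipeline (a `bpy` sketch; it needs a Movie Clip Editor context like the other clip operators):

```python
import bpy

# Creates the background image, the empties, the camera constraint,
# and the compositing node setup in one go.
bpy.ops.clip.setup_tracking_scene()
```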

You can then preview and make more changes in the viewport.


That's how I tracked, and plan to keep tracking, my cameras for my AVAV videos when I need to do it quickly. :)

If you have any questions, don't hesitate to ask them in the comments below!