Pixel Aspect Ratio — Is a video resolution of 1440×1080 considered a widescreen 16:9 format?

Camcorders sometimes offer to record video in 1440×1080 in addition to the full HD 1920×1080 mode.
Both can deliver a 16:9 (widescreen) frame aspect ratio.

The confusing part: 1440 divided by 1080 results in 1.3333…, which is the classic 4:3 format (~1.3333) but definitely not 16:9 widescreen (~1.7778).

The explanation:

Besides the frame aspect ratio, the pixel aspect ratio is the important thing here.

Pixel Aspect Ratio == pixel width (x) in relation to pixel height (y)

Pixels can be square or rectangular: a square pixel has an aspect ratio of 1:1 (i.e. 1.0), while a rectangular pixel can have any other ratio.

In the case of the 1440×1080 resolution we have RECTANGULAR pixels which are wider than they are tall; to be precise, their pixel aspect ratio is 1.3333… : 1.

Multiply 1440 by 1.3333… == 1920

Multiply 1080 by 1.0 == 1080

So you get a frame aspect ratio of widescreen 16:9 (1920×1080), achieved with rectangular pixels.
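The calculation above can be sketched in a few lines of Python (the function name is illustrative, not from any standard library):

```python
# Minimal sketch: derive the displayed frame size from the stored
# resolution and the pixel aspect ratio (PAR). The PAR stretches
# the pixels horizontally.

def display_size(stored_width, stored_height, par):
    """Return (width, height) of the frame as it is displayed."""
    return round(stored_width * par), stored_height

# HDV 1440x1080 with a PAR of 4/3 displays as full HD 1920x1080:
print(display_size(1440, 1080, 4 / 3))  # -> (1920, 1080)
```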

Details:

How is the pixel aspect ratio calculated?

You need to know the video resolution (widthVideo by heightVideo) and the frame aspect ratio, defined by a width and a height ( xF : yF ); the F here simply stands for frame.

The formula is:

(xF * heightVideo) / (yF * widthVideo) == Pixel-Aspect-Ratio X / Pixel-Aspect-Ratio Y

In our example we have a resolution of 1440 by 1080

and

a frame aspect ratio of 16:9

(16 * 1080) / (9 * 1440) == 4 / 3, i.e. ~1.3333 / 1, which results in a pixel aspect ratio of ~1.3333 in decimal.
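The formula translates directly into code; using Python's `fractions` module keeps the result exact (the function name is an illustrative choice):

```python
from fractions import Fraction

# Sketch of the formula: PAR = (xF * heightVideo) / (yF * widthVideo)
def pixel_aspect_ratio(width_video, height_video, x_frame, y_frame):
    """Pixel aspect ratio needed so that a width_video x height_video
    resolution fills an x_frame : y_frame display."""
    return Fraction(x_frame * height_video, y_frame * width_video)

par = pixel_aspect_ratio(1440, 1080, 16, 9)
print(par, float(par))  # -> 4/3 1.3333333333333333
```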

The modern digital world: computer screens and HD televisions

On computer screens and in digital processing, pixels are in general SQUARE. Only for conversion scenarios is software able to handle or output rectangular pixels.

Fortunately, modern HD TVs use square pixels, so a video that was rendered for a computer screen should display normally on a TV screen. However, if the TV screen is older and/or uses rectangular pixels, your video will either be displayed stretched or squished, OR the TV will display it between black bars on the top/bottom or left/right.

Let’s define the following general rules:

IMAGES: if an image has 640×480 pixels, then those pixels are all square.

DIGITAL VIDEO:

Full HD 1920×1080 video also uses square pixels.

HDV 1440×1080 video uses rectangular pixels with a ratio of ~1.3333.

TV:

Modern digital HDTV 1080p uses square pixels.

Older digital TV (non-HD) uses rectangular pixels.

PAL, NTSC, SECAM: these are ANALOG TV systems; they do not use pixels but analog signals in an interlaced manner.
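The rules above can be collected into a small lookup table (the dictionary and its keys are illustrative; PAL/NTSC/SECAM are omitted because, being analog, they have no pixel aspect ratio):

```python
# Pixel aspect ratios for the formats discussed in the article.
PIXEL_ASPECT = {
    "image 640x480": 1.0,      # images: square pixels
    "Full HD 1920x1080": 1.0,  # square pixels
    "HDV 1440x1080": 4 / 3,    # rectangular pixels, ~1.3333
    "HDTV 1080p": 1.0,         # modern digital TV: square pixels
}

for fmt, par in PIXEL_ASPECT.items():
    kind = "square" if par == 1.0 else "rectangular"
    print(f"{fmt}: PAR {par:.4f} ({kind} pixels)")
```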

Glossary: some video editing terms

Pixel Aspect Ratio == pixel width (x) in relation to pixel height (y)

Frame Aspect Ratio is also called Display Aspect Ratio. In the photography and video world, the term frame describes a single picture, i.e. the area of what you see in a picture. A photographer ‘frames’ a subject, i.e. decides how much of it will be seen in the picture. Imagine holding an empty picture frame in front of you and looking through it.


Comments (6)

Very insightful article. I still have a question though.

Mainly, I shoot video with a GoPro. In photo + video mode, the GoPro can only produce a 12 MP photo and 1440×1080 video simultaneously. We can’t choose another video format in this mode, and since I shoot aerial photo/video using a drone, it’s impossible for me to change modes etc. on the fly. So I have to face this reality, producing 1440 videos, while what clients mostly need is 1080p.

It’s true, like you said, that 1440 can deliver a 1080 outcome, but I found the videos are NOT as sharp as the original 1440 (when put in 1080 format). Is this related to the 1:1.3 format? In other words, is there any SHARPNESS LOSS when we produce 1080p video from originally captured 1440 video (due to the 1:1.3 to 1:1 conversion)?

I use Premiere Pro and export as H.264, QuickTime MOV, and the Photo-JPEG codec; none produce a desirable outcome in terms of sharpness.

Thanks.
Fadil

Hey Fadil,

I haven’t had the time to test the sharpness under certain circumstances but here’s my understanding of the GoPro 1440 video setting:
It is actually a
Image Size: 1920 x 1440
Frame Rate: 23.976
Pixel Aspect Ratio: 1.0
(I checked that with a GoPro Hero 4 Black and can’t make any statements for other GoPro cams)

So, in that case the pixel aspect ratio is actually 1.0 and not 1.33 as in my article, which referred to a different camera format (1440×1080 with rectangular pixels) that simulates a 16:9 display format.

The GoPro format is a real 4:3 format. Since it actually offers 1920 pixels in width, you could put it in a 1080p Adobe Premiere sequence, but the top and bottom parts would be cut off because 1080p is 1920×1080.

So far I don’t see a loss in quality.
Another criterion is your sequence setting, whether it is 1080p24 or 1080p30 or …
If you are using 1080p30 AND you keep the length of the video, I guess some interpolation will happen by adding the missing frames (24p to 30p conversion).
An option is to INTERPRET the footage: select the clip, then choose MODIFY -> INTERPRET FOOTAGE. There you can interpret the footage from 24 to 30 frames (resp. 23.976 to 29.97), which results in a shorter video run time.
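The run-time change from re-interpreting the footage can be sketched quickly (the function name and the 60-second clip are illustrative assumptions, not GoPro or Premiere specifics):

```python
# Sketch: when footage is re-interpreted from 23.976 fps to 29.97 fps,
# the frame count stays the same, so the clip plays back shorter.

def reinterpreted_duration(duration_s, old_fps, new_fps):
    frames = duration_s * old_fps  # total frames in the clip
    return frames / new_fps        # same frames played at the new rate

# A 60 s clip shot at 23.976 fps runs 48 s when interpreted as 29.97 fps:
print(round(reinterpreted_duration(60, 23.976, 29.97), 2))  # -> 48.0
```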

Whether all that conversion actually influences the quality in a visible, noticeable manner … I would doubt it, but in theory, yes.

Loss of quality can certainly result from the final encoding process: several options will influence the output, among them the codec and its settings (bitrate, etc.).

To see what actually happens with your footage would mean to make several tests and see what the outcome is. Just play around with certain scenarios and judge the outcome.

Hope this helps somehow,
Frank

Hi Frank,

What a great explanation and insightful reply.

Forgot to tell you that yes, I use a GoPro Hero 4 Black, and I just recently realised it’s 1920 × 1440, not 1440 × 1080. You’re also correct that some of the top and bottom parts are cut off.

Regarding the main problem of sharpness loss, I take note of your explanation about codec and bitrate. I will try to do some more serious tests, though so far I’ve tried several settings. Generally, the original 1440 videos are very good in terms of quality: sharpness, etc. But they degraded when converted to 1080, like I said. Recently, my agency rejected some of our footage (not all) due to a 1754 × 1080 resolution, they said, while I exported it as 1920 × 1080, and that is also what the properties say (1920 × 1080). The problem hasn’t been solved because I haven’t had any reply, but I think there is something I’m missing. Maybe you have a theory?

Furthermore, I notice that when we record video directly in 1080 mode, the quality is also not as good as 1440.

That’s what I can report at this time.

Again, thank you very much for your kind attention.
Fadil

Hi Frank,

I’ve found the answer (though a bit late to inform you).

The answer is simple: use the GoPro software.

The typical workflow would be: import all the MP4 files from the camera into GoPro Studio. Then fix the fisheye effect and select the clips we would like to use. Convert; then, again, we can choose which part of the file we would like to use in production. Then export each clip. Import the footage into Premiere and edit as usual.

So, rather than importing the files directly into Premiere, we should convert them using GoPro Studio. The quality will be there.

Hope this helps other readers too, because I’ve never found this explanation anywhere.

Thanks again, Frank, for the opportunity to discuss it and make posts.

Thanks for the update, Fadil!!!


Comments are closed.