70D - initial experiments on Dual Pixel RAW

                                                      -- a1ex, September 2018

August 25, me:

Just stumbled upon something interesting in the firmware. Not sure if nikfreak or anyone else tried that.

call("lv_daf_mode", 1); // "FACTORY:AB_OUT_LEFT"
call("lv_daf_mode", 2); // "FACTORY:AB_OUT_RIGHT"
call("lv_daf_mode", 0); // guess: default, both "channels"

I'd like to see some DNGs from LiveView (plain 1080p) after calling the above functions, without camera movement between the test shots if possible.

September 9, David Hugh:

Here are the test results! Pretty, ahem... interesting :P.

But almost like you expected, right? https://we.tl/t-BykEItcY9C

First dual pixel raw frames from LiveView

Since wetransfer links are going to expire, I've re-uploaded the files. Let's get them:

In [1]:
%%shell
wget -c -q https://a1ex.magiclantern.fm/bleeding-edge/70D/dual-pixel/wetransfer-32dddf.zip
unzip -o wetransfer-32dddf.zip
Archive:  wetransfer-32dddf.zip
Written using ZipTricks 4.6.0
 extracting: RAW-000.DNG             
 extracting: RAW-001.DNG             
 extracting: RAW-002.DNG             
 extracting: RAW-003.DNG             

Background

Two years ago I started looking into Dual Pixel RAW files from the 5D Mark IV. Beware: unfinished experiments.

One of my goals was (and still is) to see how far out-of-focus images can be recovered or refocused with my own algorithms. That wasn't exactly easy for an image processing n00b like me, but I got some promising results on synthetic data (look at [17], [18], [22] and [23] in the above link). The other goal: if the results are worth the effort, why not extend the concept to dual pixel raw video?

Oh well, so many interesting things to research, so little time...
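For intuition about why the left/right sub-pixel views matter for refocusing: an out-of-focus feature lands at slightly different horizontal positions in the two views, and shifting the views back together undoes that. Here's a toy shift-and-add sketch in NumPy (entirely synthetic data and a made-up disparity value; this is not the actual algorithm from the notebooks linked above):

```python
import numpy as np

# Toy model of dual-pixel defocus: a defocused feature appears shifted
# right by d pixels in the left sub-aperture view and left by d pixels
# in the right view. Shift-and-add refocusing undoes the disparity.

def refocus(left, right, d):
    # shift each view back by its (known) disparity and average
    return (np.roll(left, -d, axis=1) + np.roll(right, d, axis=1)) / 2

# synthetic scene: a sharp vertical edge
scene = np.zeros((8, 32))
scene[:, 16:] = 1.0

d = 3  # made-up disparity, in pixels
left  = np.roll(scene,  d, axis=1)   # left view: edge shifted right
right = np.roll(scene, -d, axis=1)   # right view: edge shifted left

print(np.allclose(refocus(left, right, d), scene))  # True: edge is back in place
```

In practice the disparity varies across the image with defocus, so the real problem is estimating it per region, but the toy example shows the basic mechanism.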

Let's open the files!

Yes, the above DNGs were from LiveView, i.e. dual pixel raw video. Just 2 frames for now - Canon engineers called them "FACTORY:AB_OUT_LEFT" and "FACTORY:AB_OUT_RIGHT", whatever that means.

In [2]:
%%shell
dcraw RAW-*.DNG
montage RAW-000.ppm RAW-003.ppm RAW-001.ppm RAW-002.ppm -geometry 50%x50%+5+5 preview.jpg

preview

What the duck?!

Time for some pixel peeping in Octave!

You will need Octave 4.x with 16-bit image support.

First, a quick'n'dirty helper for opening raw images (DNG, CR2, whatever) through dcraw:

In [3]:
function im = read_raw(filename)
    # -E is undocumented; it outputs the raw image as with -D, also including optical black areas
    dcraw = 'dcraw -c -4 -E';
    system(sprintf('%s "%s" > tmp.pgm', dcraw, filename));
    im = double(imread('tmp.pgm'));
    system('rm tmp.pgm');
end

The first image was created with default LiveView RAW settings in Canon firmware (i.e. what we use for raw video), the second image was "FACTORY:AB_OUT_LEFT", the third -- "FACTORY:AB_OUT_RIGHT", and the last one -- back to defaults.

In [4]:
A = read_raw("RAW-000.DNG");
B = read_raw("RAW-001.DNG");
C = read_raw("RAW-002.DNG");
D = read_raw("RAW-003.DNG");
In [5]:
figure,imshow([A D])
figure,imshow([B C])

A bit too dark for my taste...

Let's apply some gamma correction (2 looks good enough for our purposes) and also normalize the images (trial and error).

In [6]:
a = sqrt(max(A - 2048, 0));
b = sqrt(max((B - 512) * 16, 0));
c = sqrt(max((C - 512) * 16, 0));
d = sqrt(max(D - 2048, 0));
imshow([a d; b c])

Okay, looks like "LEFT" and "RIGHT" from our codenames just mean "crop from the left" and "crop from the right". I was expecting left sub-pixel and right sub-pixel data. Oh well...

The two crops appear to overlap. They are also slightly misaligned horizontally:

In [7]:
imshow([a(1:100,:); b(1:100,:); c(1:100,:)],[])

Let's find the horizontal offset for the two crops.

In [8]:
%plot -s 800,500

# per-column averages
ma = mean(a);
mb = mean(b);
mc = mean(c);

subplot(211); plot(ma);
subplot(212); plot(mb); hold on; plot(mc);

First cropped image appears to match the main image on the left side, and the second one on the right side. Column averages are 1-D arrays that still have the same property, so let's match these.

In [9]:
%plot -s 800,200
pkg load signal
sel = 500:1700;   # select the section with valid image data in both images (approximate)
[R, lag] = xcov(ma(sel), mb(sel), 1000);  # maximize covariance between a and b
[_, i] = max(R); offb = lag(i)
[R, lag] = xcov(ma(sel), mc(sel), 1000);  # maximize covariance between a and c
[_, i] = max(R); offc = lag(i)

subplot(111)
plot(ma); hold on;
plot(circshift(mb, [0 offb]));
plot(circshift(mc, [0 offc]));
offb = -242
offc =  228
In [10]:
% now let's shift the cropped images to match the main one
bs = circshift(b, [0 offb]);
cs = circshift(c, [0 offc]);

imshow([a(1:100,:); bs(1:100,:); cs(1:100,:)])
In [11]:
imshow(max(bs, cs))

Almost there, not exactly a perfect match, but... you get the idea.

Let's zoom in to see the Bayer pattern:

In [12]:
pkg load image

% octave's imresize is picky with out-of-range values
function im = my_imresize(im, varargin)
    im = double(im);   
    lo = min(im(:));
    hi = max(im(:));
    im = (im - lo) / (hi - lo);
    im = imresize(im, varargin{:});
    im = im * (hi - lo) + lo;
end

imshow(my_imresize([a(1:100,1001:1100) bs(1:100,1001:1100) cs(1:100,1001:1100)], 4, 'nearest'));

Wait, what?!

In [13]:
% red Bayer channel in the main image (it's RGGB)
imshow([a(1:2:end, 1:2:end), b(1:2:end, 1:2:end), c(1:2:end, 1:2:end)])
In [14]:
% green1 in the main image
imshow([a(1:2:end, 2:2:end), b(1:2:end, 2:2:end), c(1:2:end, 2:2:end)])
In [15]:
% green2 in the main image
imshow([a(2:2:end, 1:2:end), b(2:2:end, 1:2:end), c(2:2:end, 1:2:end)])
In [16]:
% blue in the main image
imshow([a(2:2:end, 2:2:end), b(2:2:end, 2:2:end), c(2:2:end, 2:2:end)])
In [17]:
% missing data on odd rows, really?!

imshow([B(2:2:end, 1:2:end) C(2:2:end, 1:2:end);
        B(2:2:end, 2:2:end) C(2:2:end, 2:2:end)])

Yeah, looks like...

Nevermind, let's look closer at the valid half of our data:

In [18]:
imshow([B(1:2:end, 1:2:end) C(1:2:end, 1:2:end);
        B(1:2:end, 2:2:end) C(1:2:end, 2:2:end)])

Looks like we need animated GIF for pixel peeping.

In [19]:
mx = max(b(:));
b1 = b(1:2:end, 1:2:end);
b2 = b(1:2:end, 2:2:end);
c1 = c(1:2:end, 1:2:end);
c2 = c(1:2:end, 2:2:end);

imwrite(b1 / mx, "b1.png");
imwrite(b2 / mx, "b2.png");
imwrite(c1 / mx, "c1.png");
imwrite(c2 / mx, "c2.png");
In [20]:
%%shell
convert -loop 0 -delay 100 b1.png b2.png b.gif
convert -loop 0 -delay 100 c1.png c2.png c.gif

b-gif c-gif

Awesome!

Let's try some more:

In [21]:
imwrite((b1 + b2) / 2 / mx, "b12.png");
imwrite((c1 + c2) / 2 / mx, "c12.png");
In [22]:
%%shell
convert -loop 0 -delay 100 b1.png b12.png b2.png b12.png bb.gif
convert -loop 0 -delay 100 c1.png c12.png c2.png c12.png cc.gif

bb-gif cc-gif

Ta-da!

Conclusion

These factory functions do, indeed, enable raw image streams from the left and right half-pixels of the image sensor. They have some limitations (the output is monochrome and doesn't cover the entire image), but I think the topic is definitely worth exploring. Next steps:

  • understanding the image readout configuration for these half-pixels (e.g. with io_trace)
  • changing it to get a full-frame color image (either A or B)
  • figuring out whether both A and B frames can be read out from the same captured image, fast enough for real-time video
  • figuring out the lossless compression on this camera, to make sure the dual pixel video stream can be saved fast enough to the (overclocked) SD card

Proof of concept complete - dual pixel raw video might be possible!
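The column-average alignment trick used above (maximizing the cross-covariance with xcov) also generalizes to estimating disparity between the two sub-pixel streams, which is the basic building block for the refocusing goal. A rough NumPy analogue of that step, on made-up data (the images and the 4-column shift are synthetic, and circular shifts stand in for real crops, so this is only a sketch):

```python
import numpy as np

def disparity_1d(a, b, max_shift=10):
    """Integer horizontal shift to apply to b (np.roll along columns)
    so that it best matches a, found by maximizing the covariance of
    the per-column averages (a rough analogue of Octave's xcov step)."""
    ma = a.mean(axis=0) - a.mean()   # centered column averages
    mb = b.mean(axis=0) - b.mean()
    shifts = list(range(-max_shift, max_shift + 1))
    scores = [np.dot(ma, np.roll(mb, s)) for s in shifts]
    return shifts[int(np.argmax(scores))]

# made-up test image: a bright vertical bar, and a copy shifted 4 columns right
left = np.zeros((32, 64))
left[:, 20:28] = 1.0
right = np.roll(left, 4, axis=1)

print(disparity_1d(left, right))  # -4: roll 'right' back by 4 columns to align
```

Applied per tile instead of to the whole frame, the same idea would yield a coarse disparity map from an A/B pair.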