Info: New colormatching script — Page 2

Author
Time
 (Edited)

pittrek said:

AntcuFaalb said:

bkev said:

Can we see some before/after shots?

Sure. They're not perfect and do note that I didn't do any levels-correction.

Sample 1

Sample 2

Sample 3

Sample 4

Sample 5

Sample 6

Would you do the whole movie? I like the colours.

The best approach is to do it shot-by-shot rather than frame-by-frame.

Also, I don't have enough experience with, e.g., setting proper black and white points.

Plus, I don't want to compete with DJ/You_Too's forthcoming release. You_Too's color choices are amazing, to say the least.

A picture is worth a thousand words. Post 102 is worth more.

I’m late to the party, but I think this is the best song. Enjoy!

—Teams Jetrell Fo 1, Jetrell Fo 2, and Jetrell Fo 3

Author
Time

AntcuFaalb said:

Plus, I don't want to compete with DJ/You_Too's forthcoming release. You_Too's color choices are amazing, to say the least.

You're not competing! Every unique version is exactly that: unique.

And thanks for liking my color settings. :)

Author
Time

You could do the "Early 80s home video" color timing of the GOUT.

Cranked up luma and all! I'd buy that for a dollar.

What’s the internal temperature of a TaunTaun? Luke warm.

Author
Time

AntcuFaalb said:  Sample 3

This shot looks gorgeous.

G-force, the script is incredible work.  I can't wait to give it a try.

Author
Time
 (Edited)

By the way, it should be mentioned that in some cases I've gotten better results with g-force's version 1.0 of the script than with 1.1.

The difference is that 1.0 tried to also balance black and white levels to the reference clip used, and in 1.1 they are locked.

G, I hope you're ok with me re-posting 1.0 here:

Function RGBMatch(clip gref, clip bref)
{
# Make RGB avg of second clip match first clip - v1.0 - by G-force

global gref = gref
global bref = bref

scriptclip(bref,"""

gref = gref.ConvertToRGB()
bref = bref.ConvertToRGB()

grefr = gref.ShowRed().ConvertToYV12()
grefg = gref.ShowGreen().ConvertToYV12()
grefb = gref.ShowBlue().ConvertToYV12()
brefr = bref.ShowRed().ConvertToYV12()
brefg = bref.ShowGreen().ConvertToYV12()
brefb = bref.ShowBlue().ConvertToYV12()

# Reference-clip stats per channel: average, with min/max locked to 0/255
y1r = AverageLuma(grefr)
y1g = AverageLuma(grefg)
y1b = AverageLuma(grefb)

y2r = 0 # Yplanemin(grefr)
y2g = 0 # Yplanemin(grefg)
y2b = 0 # Yplanemin(grefb)

y3r = 255 # Yplanemax(grefr)
y3g = 255 # Yplanemax(grefg)
y3b = 255 # Yplanemax(grefb)

# Stats for the clip being corrected
x1r = AverageLuma(brefr)
x1g = AverageLuma(brefg)
x1b = AverageLuma(brefb)

x2r = 0 # Yplanemin(brefr)
x2g = 0 # Yplanemin(brefg)
x2b = 0 # Yplanemin(brefb)

x3r = 255 # Yplanemax(brefr)
x3g = 255 # Yplanemax(brefg)
x3b = 255 # Yplanemax(brefb)

# Compute coefficients for y = a*x^2 + b*x + c using min, avg, max
ar = x1r==x2r ? 0 : x1r==x3r ? 0 : x2r==x3r ? 0 :
 \ (y3r-y1r+(x1r-x3r)*(y1r-y2r)/(x1r-x2r))/((x1r-x3r)*(x2r-x3r))
br = x1r==x2r ? 0 :
 \ ((y2r-y1r)/(x2r-x1r))-ar*(x1r+x2r)
cr = y1r-ar*x1r*x1r-br*x1r

ag = x1g==x2g ? 0 : x1g==x3g ? 0 : x2g==x3g ? 0 :
 \ (y3g-y1g+(x1g-x3g)*(y1g-y2g)/(x1g-x2g))/((x1g-x3g)*(x2g-x3g))
bg = x1g==x2g ? 0 :
 \ ((y2g-y1g)/(x2g-x1g))-ag*(x1g+x2g)
cg = y1g-ag*x1g*x1g-bg*x1g

ab = x1b==x2b ? 0 : x1b==x3b ? 0 : x2b==x3b ? 0 :
 \ (y3b-y1b+(x1b-x3b)*(y1b-y2b)/(x1b-x2b))/((x1b-x3b)*(x2b-x3b))
bb = x1b==x2b ? 0 :
 \ ((y2b-y1b)/(x2b-x1b))-ab*(x1b+x2b)
cb = y1b-ab*x1b*x1b-bb*x1b

# Apply each channel's quadratic as an MT_lut expression, then recombine
brefr = brefr.MT_lut("x x * "+string(ar)+" * x "+string(br)+" * + "+string(cr)+" +").ConvertToRGB()
brefg = brefg.MT_lut("x x * "+string(ag)+" * x "+string(bg)+" * + "+string(cg)+" +").ConvertToRGB()
brefb = brefb.MT_lut("x x * "+string(ab)+" * x "+string(bb)+" * + "+string(cb)+" +").ConvertToRGB()

MergeRGB(brefr,brefg,brefb)
ConvertToYV12()

""")

Return(last)
}
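For anyone curious about the math, the heart of the script is fitting a per-channel quadratic y = a*x^2 + b*x + c through three (source, reference) pairs: the min, average, and max of the two clips. Here's a plain-Python sketch of just that curve fit; it only illustrates the formulas in the script above, and the sample numbers are made up.

```python
def quad_coeffs(x1, y1, x2, y2, x3, y3):
    """Coefficients of the quadratic through (x1,y1), (x2,y2), (x3,y3).
    Falls back to a = 0 (a straight line) when any two x values coincide,
    mirroring the guards in the AviSynth script."""
    if x1 == x2 or x1 == x3 or x2 == x3:
        a = 0.0
    else:
        a = (y3 - y1 + (x1 - x3) * (y1 - y2) / (x1 - x2)) / ((x1 - x3) * (x2 - x3))
    b = 0.0 if x1 == x2 else (y2 - y1) / (x2 - x1) - a * (x1 + x2)
    c = y1 - a * x1 * x1 - b * x1
    return a, b, c

# Example: the source channel averages 100 where the reference averages 120,
# with black/white points locked at 0 and 255 as in the posted script.
a, b, c = quad_coeffs(100, 120, 0, 0, 255, 255)
remap = lambda x: a * x * x + b * x + c  # the curve MT_lut applies per pixel
```

So remap(100) lands on 120 while remap(0) and remap(255) stay put, which is exactly why the locked min/max version can't clip the endpoints.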

 

Author
Time

No prob, You_Too. I have also found some instances where, if the colors are too different, doing the min/max matching on each channel can be dangerous. I'm thinking about making a version where you can have anything from none to all of the min/max matching. The original version that You_Too posted is safest in the meantime. No matter what, the scripts should be applied on a shot-by-shot basis only, and the outputs checked carefully. Otherwise, have fun!

Author
Time

Oh, one more thing AntcuFaalb, the script in the first post has the option to determine the correction from two different sources, and then apply it to a third. So you MAY get a bit more accurate results if you crop the GOUT and the rental copy to match each other first, and then apply the changes to the uncropped GOUT. The version that You_Too posted doesn't have this option, so I'll make an updated version after Christmas that has all the options of both scripts, and maybe a YUV version as well. I don't have my computer with me right now, otherwise I'd get on it sooner.
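The two-source idea can be sketched numerically. This is a hugely simplified, hypothetical Python illustration (one channel, plain average matching, made-up values), not g-force's actual script: the correction is measured on an aligned, cropped pair, then applied to the full uncropped frame.

```python
def gain_from_pair(reference, source):
    """Average-matching gain measured on the aligned, cropped pair."""
    return (sum(reference) / len(reference)) / (sum(source) / len(source))

def apply_gain(frame, gain):
    # Apply the measured gain to every pixel, clamping at white.
    return [min(255.0, v * gain) for v in frame]

cropped_ref = [120, 130, 125, 125]          # cropped reference (e.g. the rental copy)
cropped_src = [100, 110, 105, 105]          # GOUT cropped to the same picture area
full_src    = [100, 110, 105, 105, 90, 95]  # uncropped GOUT frame

g = gain_from_pair(cropped_ref, cropped_src)
corrected = apply_gain(full_src, g)  # correction derived on the crop, applied everywhere
```

The point is that the measurement only sees pixels both versions share, so differing crops can't skew it, while the fix still covers the whole frame.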

Author
Time

Excellent results, g-force!

How difficult would it be to modify your script for use in After Effects?

Author
Time

snicker said:

Excellent results, g-force!

How difficult would it be to modify your script for use in After Effects?

I don't know. Really hard if I did it myself. Do you have any experience making After Effects scripts?

-G

Author
Time
 (Edited)

None, unfortunately. Hopefully somebody who can adapt it will chime in because, other than a couple of commercial options, I can't find anything like this for After Effects.

I'll do some research into AE scripting after Christmas and see if I can port it. I'm really surprised how well it's worked in the posted samples.

Author
Time

I've updated the script in the first post to optionally not perform the min/max matching like in the early version of the script that You_Too posted.

snicker,

we need to try to port your clipping removal to Avisynth. Any chance you could provide a tutorial on how you do it? I know you did already in another thread, but I can't seem to find it, and I remember not quite being able to follow it previously. Any chance you could take another stab at explaining it to us?

-G

Author
Time
 (Edited)

After some more experimenting, I'd like to request an update if it's possible:

Can this plugin be changed so that it only adjusts the mids in the RGB gamma curves (around 128) and leaves black and white levels alone? That way it might put out a much better result in some cases, and I think it would prevent luma flickering when using it to match two video clips.

EDIT:

I don't know if what I mentioned there would be the cure, but the problem I'm having is, for example: in a bright scene with lots of bright whites, a dark object passes by without ever covering the whole screen, yet the output video gets darker while the object passes.

Maybe it's because this plugin calculates the average luma of the whole image? At least that's what I think it does?

Maybe it should be based on calculating luma not by average but by black and white levels, no matter where in the image they are? Or by mids like I said before.

No idea what would work best, but I'd love to see a fix for this.
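You_Too's suspicion is easy to illustrate numerically. In this toy Python example (made-up pixel values, one channel), a dark object entering the frame drags the whole-frame average down even though the rest of the scene hasn't changed, so any per-frame average-matching gain would swing with it.

```python
def frame_average(frame):
    # Whole-frame average -- what an AverageLuma-style measurement sees.
    return sum(frame) / len(frame)

bright_scene = [220] * 90 + [200] * 10               # mostly bright frame
with_object  = [220] * 70 + [200] * 10 + [30] * 20   # dark object enters

avg_before = frame_average(bright_scene)  # 218.0
avg_during = frame_average(with_object)   # 180.0
# A gain computed per frame against a fixed reference would jump by
# roughly 218/180 (about 1.2x) the moment the object appears, which is
# exactly the kind of shot-wide brightness pump You_Too describes.
```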

Author
Time

Hey You_Too,

any chance you can post or PM me the problem clips, as well as how you are calling the function? I'll take a look and see what can be done.

-G

Author
Time
 (Edited)
snicker,

we need to try to port your clipping removal to Avisynth. Any chance you could provide a tutorial on how you do it? I know you did already in another thread, but I can't seem to find it, and I remember not quite being able to follow it previously. Any chance you could take another stab at explaining it to us?

-G

I'm not sure how you would accomplish this. It's assembled in After Effects and uses an intricate system of layer masks, colour keys, and several third-party AE plug-ins. You would have to duplicate the functions of these plug-ins to port it to AviSynth.

As for explaining how it works... it's a little convoluted but the basic idea goes like this:

Each colour channel is corrected separately. Below is a simplified version of how the red channel shadow detail is restored (AE layer order).

  • Duplicate source video layer (red channel luma) - clipped shadows keyed
  • Duplicate source video layer (green channel luma) - blend mode 'lighten'
  • Duplicate source video layer (blue channel luma) - blend mode 'lighten'
  • Source video layer (red channel luma)

 

The opacity of the blend channels is set to something like 2%-5%. This is enough to restore editable data (if it exists). There is (a lot) more to it than that but this is the easiest way of describing the basic function.

The individual luma channels are then recombined to form a new composite RGB channel. Further edits can then be made to this new video layer.

The reason for using such low opacity is that if highlight colour clipping is present in any of the blend layers (and it often is) it will be introduced to the source channel and screw with the colour. This can be avoided by keying clipped highlight colours from the blend layers allowing for higher opacities.
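A rough per-pixel sketch of the simplified pass described above, in Python. The 3% opacity comes from snicker's stated 2%-5% range; the function names and the single-pixel framing are purely illustrative, not snicker's actual AE setup.

```python
def lighten(base, blend, opacity):
    """AE-style 'lighten' at reduced opacity: take the brighter of the two
    values, then mix the result back into the base at the given strength."""
    return base + (max(base, blend) - base) * opacity

def restore_red(red, green, blue, opacity=0.03):
    # Lighten-blend the green and blue channel luma into the red channel,
    # each at low opacity, to pull a little editable data into crushed areas.
    out = lighten(red, green, opacity)
    out = lighten(out, blue, opacity)
    return out

# A crushed red value (0) next to surviving green/blue detail (40, 60)
# regains a small, editable amount of signal:
restored = restore_red(0, 40, 60)
```

Note that where red is already the brightest channel, the max leaves it untouched, which is why the lighten mode (rather than a plain mix) avoids polluting intact areas.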

Author
Time

I agree that it would be hard to port that to avisynth.

Either way, snicker, you should really make a guide sometime on how to do this. And maybe include some before/after screenshots.

Author
Time

Blend mode lighten in After Effects just takes the higher value of the source layer and the other layer. This is a simple LUTxy in Avisynth. The keying is just a LUT. Some of the details still don't make sense to me, but all of the operations seem quite easy.
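As a concrete sketch of that claim (plain Python, made-up values): lighten at full opacity is a per-pixel max, the equivalent of an mt_lutxy "x y max" expression, and a key is just a 256-entry lookup table. The threshold of 16 here is arbitrary.

```python
def lighten_blend(plane_a, plane_b):
    """Per-pixel maximum of two planes -- AE 'lighten' at 100% opacity,
    i.e. mt_lutxy("x y max") in AviSynth terms."""
    return [max(a, b) for a, b in zip(plane_a, plane_b)]

# A key expressed as a LUT: pass values above a threshold, zero out the rest.
threshold = 16
key_lut = [v if v > threshold else 0 for v in range(256)]

plane_a = [10, 120, 200]
plane_b = [5, 100, 180]
blended = lighten_blend(plane_a, plane_b)  # [10, 120, 200]
keyed = [key_lut[v] for v in blended]      # [0, 120, 200]
```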

Author
Time
 (Edited)

G-force, what I've posted above is a very simplified version of just the initial de-clipping pass. There are 5 passes at shadow clipping in total and each one uses a 'new' RGB source (generated in the previous pass) as its target.

I've used a lot of curves adjustments, and numerous filters and third-party plug-ins, which would all require accurate translation to match up to what I have assembled in After Effects.

The example above is an easy port but it gets a lot more complicated after this step.

 

You_Too, I will at some point post a tutorial if I can figure out a way to present it. I have a full workload at the moment so it won't be happening for a while but I will aim to post some before/after shots.

 

G-force, if you have no objections, I could post a few samples in this thread. It might be easier if I post images and then answer questions about how the results are obtained.

Author
Time

An obnoxious bump with a request. You'd think I only just joined this site or something.

Anyone wanna take a stab at applying the colors from the 1982 rental transfer by Retartedted to the GOUT?

A Goon in a Gaggle of 'em

Author
Time

A while ago, g-force asked me to post whatever results I had in this thread, and I've at last gotten round to it.

Using the function, I've applied the colors of the Criterion DVD of Spartacus to the HD DVD by Universal.  The initial results are promising.

Top:  Criterion DVD

Middle:  HD DVD + cc

Bottom: HD DVD

http://i53.tinypic.com/33viupt.png

http://i56.tinypic.com/15c8sj.png

http://i53.tinypic.com/30wxhdx.png

http://i51.tinypic.com/2wqhlpe.png

The two versions were aligned temporally but not spatially.  (I assume this does not matter.)  As yet, I've not tried to go scene-by-scene, but I might try at some point.  (Going shot-by-shot seems as if it would be a great deal of work.)

Author
Time
 (Edited)

^ Not bad! At first I read it wrong and thought you were trying to make the Criterion look like the HD, lol. Nice job, how's it look in motion?

OT-DAWT-COM neighbour and sometime poster (Remember, Tuesday is Soylent Green day!)

Author
Time

Chewtobacca said:

Captain? What is this, the navy?

Well, some people might think it is after looking at my avatar! ;)


Author
Time

My bad, I just got my words mixed up is all; it's still early in the morning for me.


Author
Time

AntcuFaalb said:  Well, some people might think it is after looking at my avatar! ;)

Hahaha!  :-)

In answer to Rogue-theX's question, I haven't tried to encode it yet.  I've only just finished a short script syncing the two together.