
Automated color grading and color matching with a Machine Learning Algorithm — Page 5

Author
Time
 (Edited)

Nothing personal lol I’ve played with it many times and never quite gotten the color I was hoping for.

And truth be told, I preferred Neverarbot’s regrade over Neverargreat’s for the shot of Ben, Luke, and 3PO on the cliff. 😜

Preferred Saga:
1/2: Hal9000
3: L8wrtr
4/5: Adywan
6-9: Hal9000

Author
Time
 (Edited)

Darth Lucas said:

So forgive my technical ignorance, DrDre, but how exactly does this work? Do you have to manually attach a shot to a source frame? Or does it take the bulk of reference frames you have and build a universally applicable model to grade any shot with?
And how is the grading accomplished? Does it just take the color palette of your references and match the source to it as closely as possible?

A detailed explanation of what steps the software takes would go a long way for helping a non-technically minded person like myself to comprehend what’s going on here.

No, you don’t have to manually attach a shot to a source frame. The algorithm just needs a set of source frames and their references. These references are not the original reference frames, but the source frames matched to the references by the color matching algorithm, so that the color of a pixel in a source frame maps directly to the same pixel in its reference. For any new shot that you want to regrade, the algorithm automatically assigns higher weights to the references that are most similar to the new shot. There are no manual steps involved. You just point it to the shots you want to regrade, and presto…
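To make the weighting idea concrete, here’s a minimal sketch of that scheme in Python. This is not DrDre’s actual algorithm — the similarity measure (distance between mean frame colors) and the per-reference mapping (a simple mean/std color transform) are stand-ins I’ve chosen for illustration; the real tool presumably uses something far more sophisticated:

```python
import numpy as np

def similarity_weights(new_frame, ref_sources):
    """Weight each reference by how close its source frame's mean
    color is to the new frame (inverse-distance weighting)."""
    target = new_frame.reshape(-1, 3).mean(axis=0)
    dists = np.array([
        np.linalg.norm(r.reshape(-1, 3).mean(axis=0) - target)
        for r in ref_sources
    ])
    w = 1.0 / (dists + 1e-6)
    return w / w.sum()

def regrade(new_frame, ref_sources, ref_grades):
    """Blend per-reference color transforms, weighted by similarity.
    Each transform maps a reference's source statistics (mean/std per
    channel) onto its graded statistics -- a toy stand-in for the
    pixel-to-pixel mapping described above."""
    w = similarity_weights(new_frame, ref_sources)
    flat = new_frame.reshape(-1, 3).astype(np.float64)
    out = np.zeros_like(new_frame, dtype=np.float64)
    for wi, src, ref in zip(w, ref_sources, ref_grades):
        s = src.reshape(-1, 3).astype(np.float64)
        r = ref.reshape(-1, 3).astype(np.float64)
        scale = r.std(axis=0) / (s.std(axis=0) + 1e-6)
        mapped = (flat - s.mean(axis=0)) * scale + r.mean(axis=0)
        out += wi * mapped.reshape(new_frame.shape)
    return np.clip(out, 0, 255).astype(np.uint8)
```

A new frame that closely resembles one reference’s source frame ends up dominated by that reference’s grade, which is the behavior described above.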

Author
Time

Just found this thread and I’m in awe. This looks really amazing and I’m definitely looking forward to its release.

One question though: Will the algorithm regrade the entire movie at once or do you give it individual scenes/shots that will then be regraded? I’m asking because in the latter case, scene wipes might be a problem (and also, just getting the entire movie regraded at once would be even more convenient^^).

Author
Time

Kexikus said:

Just found this thread and I’m in awe. This looks really amazing and I’m definitely looking forward to its release.

One question though: Will the algorithm regrade the entire movie at once or do you give it individual scenes/shots that will then be regraded? I’m asking because in the latter case, scene wipes might be a problem (and also, just getting the entire movie regraded at once would be even more convenient^^).

The algorithm will regrade a frame for each individual shot, generating a LUT that then has to be used to regrade the complete shot. Scene wipes will have to be fed to the algorithm individually, but this shouldn’t be a problem.
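The one-frame-per-shot workflow can be sketched as follows. Note this is my own simplified illustration, assuming a per-channel 256-entry lookup table rather than the full 3D LUT the post describes; the function names are hypothetical:

```python
import numpy as np

def build_channel_luts(src_frame, graded_frame):
    """Build a 256-entry LUT per channel from a single regraded frame:
    for each input level actually seen, average the graded values
    observed at that level, and interpolate the unseen levels."""
    luts = []
    for c in range(3):
        s = src_frame[..., c].ravel()
        g = graded_frame[..., c].ravel().astype(np.float64)
        sums = np.bincount(s, weights=g, minlength=256)
        counts = np.bincount(s, minlength=256)
        seen = counts > 0
        lut = np.interp(np.arange(256), np.flatnonzero(seen),
                        sums[seen] / counts[seen])
        luts.append(lut.astype(np.uint8))
    return luts

def apply_luts(frame, luts):
    """Regrade any other frame of the same shot with the stored LUTs."""
    return np.stack([luts[c][frame[..., c]] for c in range(3)], axis=-1)
```

The point of the LUT is exactly what the post says: the algorithm only has to regrade one frame per shot, and the cheap table lookup then carries that grade across every remaining frame of the shot.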

Author
Time

To add to the compliment pile here, I’d just like to say that algorithm behind this and the results are incredible.

Author
Time
 (Edited)

Here’s a test of the color grading algorithm using the preview of NeverarGreat’s regrade of reel 4. I selected 16 shots from the reel as references to train the algorithm, then used it to approximate the look and feel of NeverarGreat’s regrading prowess on 16 different test shots.

The reference frames I have chosen are:

The test frames from the bluray are the following:

Here are NeverarGreat’s regrades for these shots:

Here are the NeverarBot’s regrades for the same shots:

While there are slight differences here and there, the regrades of the NeverarBot closely approximate NeverarGreat’s manual regrades.

Author
Time

Looks very promising. The Death Star walls still vary in color, but that is expected since my regrade didn’t go for strict adherence to a standard color, opting for a location-by-location color shift instead. The only other significant change I see is that the bot has less saturated skin tones, which makes sense since the bot probably doesn’t prioritize skin tones.

I should note that 5 or 6 of the test frames are from shots that I altered with color from another source, so the bot probably wouldn’t have been able to replicate that look anyway (see the R2 and 3PO shot and the trash compactor shot).

You probably don’t recognize me because of the red arm.
Episode 9 Rewrite, The Starlight Project (Released!) and ANH Technicolor Project (Released!)

Author
Time

Very promising indeed! Nice job you two…um…three.

Author
Time

Is this limited to color, or could you change other attributes? Like running say 5 different shots through PFclean, then letting the learning algorithm try and degrain based on the results.

Or using it to combine different sources- say you have two different sources, but one has more picture on the left and right or top and bottom (Fellowship of the Rings theatrical vs extended)

You manually combine a variety of the frames, adding as much extra picture as possible without causing uneven letterbox borders. You then train the algorithm on the two source images, together with a reference image in which they have been manually combined.
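The manual combination step being proposed — overlaying two differently cropped scans of the same shot to recover extra picture — could be sketched like this. This is purely an illustration of the poster’s idea, not anything the algorithm does; the centering assumption is mine:

```python
import numpy as np

def combine_open_matte(a, b):
    """Center two differently cropped scans of the same shot on a
    canvas sized to the larger dimensions. Where both overlap, pixels
    from `a` win; `b` contributes only its extra picture area.
    Assumes the scans are already aligned and color-matched."""
    H = max(a.shape[0], b.shape[0])
    W = max(a.shape[1], b.shape[1])
    canvas = np.zeros((H, W, 3), dtype=a.dtype)
    for img in (b, a):  # paste `a` last so it wins in the overlap
        y = (H - img.shape[0]) // 2
        x = (W - img.shape[1]) // 2
        canvas[y:y + img.shape[0], x:x + img.shape[1]] = img
    return canvas
```

In practice the two sources would also need geometric alignment and color matching before being overlaid, which is a much harder problem than the paste itself.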


Author
Time

The algorithm is currently designed specifically for color grading. It would need to be completely redesigned for other purposes.

Author
Time
 (Edited)

DrDre said:

The algorithm is currently designed specifically for color grading. It would need to be completely redesigned for other purposes.

I think you’re extremely close to nailing it; the NeverarBot one seems awfully close to perfect.

Author
Time
 (Edited)

NeverarGreat said:

Looks very promising. The Death Star walls still vary in color, but that is expected since my regrade didn’t go for strict adherence to a standard color, opting for a location-by-location color shift instead. The only other significant change I see is that the bot has less saturated skin tones, which makes sense since the bot probably doesn’t prioritize skin tones.

I should note that 5 or 6 of the test frames are from shots that I altered with color from another source, so the bot probably wouldn’t have been able to replicate that look anyway (see the R2 and 3PO shot and the trash compactor shot).

It’s not so much a case of not prioritizing skin tones as of needing more references to teach the algorithm how to deal with them. Here’s another set of regrades, where I’ve also used the second set of 16 shots as references. The saturation of the skin tones is now much closer.

Bluray:

Bluray regraded by NeverarGreat:

Bluray regraded by the NeverarBot:

Author
Time
 (Edited)

This is amazing stuff, DrDre. I’ve been looking forward to your Raiders WOWOW regrades for a while; would the bot help to hasten progress on that at all? *

  • no pressure mate. I know you have to prioritize other stuff right now 😉
Author
Time
 (Edited)

In the newest NeverarBot shots some look better, but others look wrong; the actors look reddish in some shots. It needs to be somewhere in between NeverarGreat and the NeverarBot.

Author
Time

Could this theoretically be used to make skin tones consistent throughout the entire film?

Author
Time

That depends on how consistent the skin tones are in the source you want to regrade. If they are consistent, then yes.

Author
Time

Is it possible to colorize black and white scenes?

Author
Time

No, you can’t create information that isn’t there. You can, however, use it to correct scans of faded prints.

Author
Time

If I remember correctly, the 2004 DVD menus used the same video source prior to the color “correction”, and it looked better than the actual films.

I would love to see a MenuBot


Author
Time

Going off the fact that you needed quite a few photos for references: in theory, could you just use one large high-res photo that has multiple frames in it to get a quality LUT out of it?

Author
Time

Deloreanhunter12 said:

Going off the fact that you needed quite a few photos for references: in theory, could you just use one large high-res photo that has multiple frames in it to get a quality LUT out of it?

I don’t know if it will work for the automated color grading, but it did for DrDre’s color matching tool. I used that with 9 frames to get a LUT that regrades the AOTC Blu-ray to the DVD colors. This is of course not a perfect solution, but it gives good results.

And I’d assume the same holds for the automated color grading: It should work but won’t be as good as having the algorithm grade single scenes.
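For anyone wanting to try the same trick, the contact-sheet idea is straightforward: tile the frames into one image so that a single source/reference pair exposes the color range of many shots at once. A minimal sketch (my own helper, not part of DrDre’s tool):

```python
import numpy as np

def tile_frames(frames, cols=3):
    """Tile equally sized frames into one contact-sheet image.
    Feeding the tiled source sheet and the tiled reference sheet to a
    color matching tool as a single pair lets one LUT cover the color
    range of all the frames at once."""
    h, w, _ = frames[0].shape
    rows = -(-len(frames) // cols)  # ceiling division
    sheet = np.zeros((rows * h, cols * w, 3), dtype=frames[0].dtype)
    for i, frame in enumerate(frames):
        r, c = divmod(i, cols)
        sheet[r * h:(r + 1) * h, c * w:(c + 1) * w] = frame
    return sheet
```

As noted above, one global LUT built this way trades per-scene accuracy for convenience: it averages over everything in the sheet rather than adapting shot by shot.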

Author
Time
 (Edited)

JawsTDS said:

DrDre said:

towne32 said:

Well those shots look great. Has there been any shot you’ve tried so far, where things go really wrong?

Not really…they’re all clear improvements over the bluray, even if some might require a little tweaking. These were created with just eight of Neverar’s regrade examples. The great thing is that, depending on your tastes, you can create a HarmyBot, kk650-Bot, DreamasterBot, SwazzyBot, or perhaps even a Towne32Bot 😉. Alternatively, you could take some Technicolor print scans, or another official color grading, and train the algorithm to reproduce those colors.

I eagerly await the day we can create a GOUTbot.

Could this be attempted, for kicks? Might be fun to see how it’d look.
And, Kexikus, please don’t post any comparison pics of AOTC; it’ll only tempt me to revisit it! 😛

My stance on revising fan edits.

Author
Time

I’m planning to release a color grading tool based on this methodology. The development will take a little time, as I’m preoccupied with a number of other projects, but I will keep you posted…

Author
Time

DrDre said:

I’m planning to release a color grading tool based on this methodology. The development will take a little time, as I’m preoccupied with a number of other projects, but I will keep you posted…

Sounds great. Can’t wait for it

Author
Time
 (Edited)

Here’s a set of regrades by the DrDreBot v1.0:

Bluray:

Bluray regraded: