
DrDre

User Group
Members
Join date
16-Mar-2015
Last activity
6-Sep-2024
Posts
3,989

Post History

Post
#1084299
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

Ladies and gentlemen, I give you the TechBot. Rather than train the model on NeverarGreat’s amazing regrades, I’ve now trained the algorithm on the calibrated Technicolor print scans. Here are some results.

Bluray:

Bluray regraded by the TechBot:

Bluray:

Bluray regraded by the TechBot:

Post
#1084018
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

towne32 said:

Well those shots look great. Has there been any shot you’ve tried so far, where things go really wrong?

Not really… they’re all clear improvements over the bluray, even if some might require a little tweaking. These were created with just eight of Neverar’s regrade examples. The great thing is that, depending on your tastes, you can create a HarmyBot, kk650-Bot, DreamasterBot, SwazzyBot, or perhaps even a Towne32Bot 😉. Alternatively, you could take some Technicolor print scans, or another official color grading, and train the algorithm to reproduce those colors.

Post
#1084007
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

towne32 said:

DrDre said:

Depending on a number of parameters it will take anywhere from a few minutes to much longer. These were all done in a few minutes.

So, would it be possible at some point to have the bot crawl through all of the shots in a scene (or film, or selections of scenes you deem it fit to regrade) and make the correction it’s trained to make. Then, produce LUTs based on the changes made (assuming LUTs could vary slightly between shots in the same scene), and apply that to the shot as a whole? Just trying to think of how a whole film would be graded with a few minutes per frame. Cloud computing/supercomputers of course would get through it quickly.

Exactly! For scenes for which it doesn’t have a reference, it will make a best guess based on the preferences it has learned. As we’ve seen earlier, this often gives good results, although in some cases it may need additional references, if a scene was graded very differently on the bluray compared to your own preference.
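For what it’s worth, towne32’s LUT idea can be sketched in a few lines. This is purely my own toy illustration, not anyone’s actual tool: `transform(c, v)` stands in for whatever per-channel mapping the bot has learned for a shot, and it gets baked into a small per-channel table so the whole shot can be graded in one cheap pass. A real grade would more likely need an interpolated 3D LUT.

```python
# Hypothetical sketch: bake a learned per-channel transform into a LUT.
# `transform` is a stand-in for the mapping the bot learned for a shot.

def build_lut(transform, size=33):
    """Sample the learned transform into a per-channel 1D LUT."""
    return [[transform(c, i / (size - 1)) for i in range(size)]
            for c in range(3)]

def apply_lut(lut, pixel):
    """Nearest-entry lookup; a real tool would interpolate."""
    size = len(lut[0])
    return tuple(lut[c][round(v * (size - 1))]
                 for c, v in enumerate(pixel))
```

Once the table is built, applying it to every pixel of a shot is just a lookup, which is why LUTs are so much cheaper than re-running the bot per frame.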

Post
#1084005
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

CatBus said:

Sooo let me get this straight. Wait, let me step back. This seems awesome. Okay, that’s done. But I’m not sure what this is exactly.

What exactly do you feed the bot at the beginning? The corrected frames plus equivalent original frames, right? Any guidelines, like make sure the primaries are well-represented? Number of frames? Mix interior/exterior lighting?

Yes, you feed the model pairs of corrected and uncorrected frames. The more frames you have, the better the results in general.
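To give a flavor of what “feeding pairs” could mean, here is a deliberately simple sketch of my own (not the actual algorithm, which is surely more sophisticated): fit a per-channel linear map, a gain and a lift, from paired uncorrected/corrected pixels, then apply it to unseen pixels.

```python
# Toy sketch: learn a per-channel linear color map (gain and lift)
# from paired training pixels. Pixels are (r, g, b) tuples in 0..1.

def fit_channel(src, dst):
    """Least-squares fit of dst ~ gain * src + lift for one channel."""
    n = len(src)
    mean_s = sum(src) / n
    mean_d = sum(dst) / n
    var = sum((s - mean_s) ** 2 for s in src)
    cov = sum((s - mean_s) * (d - mean_d) for s, d in zip(src, dst))
    gain = cov / var if var else 1.0
    lift = mean_d - gain * mean_s
    return gain, lift

def train(pairs):
    """pairs: (uncorrected_pixel, corrected_pixel) tuples."""
    return [fit_channel([p[0][c] for p in pairs],
                        [p[1][c] for p in pairs]) for c in range(3)]

def apply_grade(maps, pixel):
    """Apply the learned map, clamping to the valid 0..1 range."""
    return tuple(min(1.0, max(0.0, g * v + l))
                 for (g, l), v in zip(maps, pixel))
```

A linear map obviously can’t capture a real regrade; the point is only that “training” here means fitting a mapping from uncorrected to corrected colors, which can then be applied to frames the model has never seen.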

Then step 2 machine learning happens, which I’m perfectly happy to leave at: ???

Then for step 3, here’s where this crap always falls apart for me. It seems to produce, what’s the word… damn good results. Are there exceptions? Does it fall down on filtered shots? Grain, dirt, damage, misalignment? And if it doesn’t, umm, I dunno, I was just wondering when we could take a turn with this shiny new toy of yours?

One of the major challenges is getting the algorithm to recognize different scenes from NeverarGreat’s color timing. Overall he removed the blue cast from the bluray, but there are exceptions. For example, in the Death Star conference room scene the walls appear more blue in his regrade.

Bluray:

NeverarGreat regrade:

So, if I let the algorithm regrade a shot from the same scene, I want to see a very similar color timing with the same walls, same skin tones, same color clothes etc, without having to tell the algorithm which scene it’s regrading.

Bluray:

Bluray regraded by NeverarBot:

Bluray:

Bluray regraded by NeverarBot:

Another example is the Leia shot aboard the Tantive IV, where NeverarGreat added a greenish hue to the walls.

Bluray:

NeverarGreat regrade:

Again, if I let the algorithm regrade a shot from this scene, I want to see a very similar color timing, with the same greenish walls, Leia’s red lips, etc.

Bluray:

Bluray regraded by NeverarBot:

Bluray:

Bluray regraded by NeverarBot:

Post
#1083699
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

dahmage said:

Wouldn’t this need some sort of feedback loop / evaluation in order to ‘learn’? or maybe that is a distinction between machine learning and neural networks? But it just seems common sense to me that this thing will only do a good job on color matching for new unknown frames if it has some way to evaluate the choices it made. (but then again, sometimes common sense is wrong about these things).

I’m not much of a fan of neural networks personally. Keep it simple, stupid, is my credo. The more complicated you make these things, the better they work in some situations, while screwing up stuff that should be straightforward.

Post
#1083692
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

Depending on a number of parameters it will take anywhere from a few minutes to much longer. These were all done in a few minutes.

I could do a lightsaber shot, but since NeverarGreat rotoscoped some of the lightsaber effects, I suspect it won’t be able to reproduce those effects. It’s a color grading tool, not a restoration tool.

Post
#1083659
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

towne32 said:

Looks great. Neverar can retire early.

As with much of your software, Dre, when I look at this I think “oh, this has commercial potential, why is he wasting his skills on us?”. Which got me thinking, what are the potential legal ramifications of machine learning algorithms like this? If Neverargreat was a commercial entity, and you built an algorithm that used examples of his to learn from, but didn’t actually include any of his work in the software, does he still have any stake in what that ‘bot’ produces? Presumably not, in cases where an AI is trained from the work of numerous artists. But if you really are training a bot to be a Neverar that doesn’t need food/water/sleep, how does that work out?

Well, in a way the algorithm is attempting to distill some of Neverar’s preferred colors from the reference frames. I suspect you cannot simply use someone else’s work as a reference in a commercial setting, as you would then be profiting from his hard work. It’s a smart algorithm (if I say so myself), but garbage in, garbage out. If it weren’t for Neverar’s amazing talent, the algorithm would have nothing to show for itself.

Post
#1083656
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

Here are seven more regrades by the NeverarBot.

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Bluray:

Bluray regraded by the NeverarBot:

Post
#1083539
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

NeverarGreat said:

The biggest challenge is just keeping the tones consistent across the entire film, while at the same time avoiding the ‘samey’ look of an overaggressive blanket color grade. It’s a fine line, and from the example frames above it looks like this monotone look could be a problem when using this software. With the 3 frames that Dre chose, you can see the variation in the skin tones from location to location. Yet with the two examples, the color of Obi-wan’s face looks the same in the desert as it does in the Death Star.

In reality you would use at least 10 reference frames in varying conditions to train the algorithm. So, if properly trained, the algorithm should not have a problem with skin tones under varying lighting conditions.

Post
#1083482
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

Deloreanhunter12 said:

I’m curious what would happen if you took the Lord of the Rings colors and placed them on The Hobbit’s colors. Would the color palette and contrast change, or would it just be the color palette? Idk what I’m talking about at this point, I’m just interested to see what would happen.

The algorithm needs to be trained first on some references to learn how you would like to see the film graded. If you have 35mm frames, for example, you can use a couple of them as references to teach the algorithm that you would like the film to closely resemble the 35mm print, even for shots where you don’t have references.

For someone doing manual color grading, the algorithm could be used to provide a better starting point for a final color grade. You first teach the algorithm how to color grade like you, based on a number of manual color adjustments you provide. You subsequently let the algorithm color grade the other shots, based on what it learned from your work. Once the algorithm has finished color grading the rest of the film, you can make final adjustments as you see fit.

Post
#1083422
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

Stotchy said:

DrDre said:

I will answer your question only once, if you don’t mind…😉 No, it’s a bit more complicated than that. This is still a simple version based on three frames, but ultimately the model should be able to recognize different types of scenes, and grade them differently, if it is trained to do so.

Firstly - you’re a genius! You continue to amaze me the way you innovate to automate this process.

In theory, using your machine learning algorithm you could take any set of sources and apply it to quickly reproduce that version of the film to the bluray (or any other video source)?

For example, if you wanted to create a GOUT timed version of the bluray you could apply GOUT frames and it would reproduce that timing? (Just using as an example, not a big fan of the GOUT timing)

Also, is it a case that the more frames you add as references the more accurately the algorithm can reproduce that source? E.g. using more NeverarGreat frames as references would more accurately reproduce his colour timing through the whole film?

Finally, would (in theory) this work for contrast and brightness?

Love your work!

Thanks! 😃 Yes, in principle you could recreate a GOUT timing by using a set of GOUT timed references, and then have the algorithm reproduce that timing on the rest of the film. Like you said, the more references you have, the better the results are likely to be. It would also reproduce the contrast and brightness adjustments.

Post
#1083327
Topic
4K restoration on Star Wars
Time

crissrudd4554 said:

Ok here’s how it goes. From the consensus I’ve seen it appears that a good majority of the Star Wars fans who own the current BD sets are fine with them as is. So a simple 4K restoration of the same SE is likely not gonna be enough of a good selling point to sell the same version again. So what do they do?? Restore the original cuts to be included WITH the 4K restored SE.

Seriously guys. Quit being morons. Theoretically speaking what sounds like it’ll profit more even if by a fraction?? A 4K release with only the SE or a 4K release with both cuts?? One cut caters to only a specific market while more than one cut caters to multiple markets.

Case in point. How many people do you actually suspect petitioned to have all those cuts of Blade Runner released together?? Compared to Star Wars probably little to nothing. Did it stop Ridley Scott from remastering those cuts and releasing them together?? Nope. Another example is how a few years ago Kino released a cut of the silent film Metropolis with restored footage not seen since the original release. Compared to Star Wars who do you think really gave a crap about that?? Nonetheless did it prevent them from restoring and releasing it?? Don’t think so.

I’m honestly puzzled why some of you even ask why Disney would spend the money to restore the originals while at the same time neglecting to realize that a restored OUT is likely gonna be packaged as an extra to the SE anyways. However as long as the originals are presented in proper quality I’m sure the majority of us would be fine with it.

Until Lucasfilm/Disney gives a direct response on the matter the speculation resumes as far as where I stand. And no the Kathleen Kennedy interview does not count.

Are YOU serious? Most Star Wars fans have bought pretty much every release of Star Wars since the first home video version was released in the early 1980s. All they needed to do was claim it’s been remastered, or add a few extras. So, given that reality, what makes you believe this trend will suddenly stop with a 4K release? If they release a 4K version with a few new extras, the damn thing will sell like crazy, like all Star Wars releases have, with or without the OOT.

Post
#1083279
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

I will answer your question only once, if you don’t mind…😉 No, it’s a bit more complicated than that. This is still a simple version based on three frames, but ultimately the model should be able to recognize different types of scenes, and grade them differently, if it is trained to do so.

Post
#1083273
Topic
Automated color grading and color matching with a Machine Learning Algorithm
Time

How many of you doing manual color corrections have ever wished you could speed up the process? What do you do when you only have color references for part of the film you are correcting? How many of you wish you could do color corrections like many of the talented folk here on these boards, but lack the talent or the desire to spend the time doing it?

While working on my little Technicolor scanning project, I realized I’m missing a complete reel of references. Since I don’t have the talent or the patience to do all the manual labour, I decided to do what I do best: build an algorithm to do it for me. I’ve recently been preoccupied with Machine Learning in my normal line of work, and this gave me an idea:

What if I could train an algorithm to do what another professional, or fan has done using a number of reference frames, and then repeat the process on a completely different frame from a different scene?

Theoretically it should be possible. In fact, it is! 😃
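To give a flavor of the idea, here’s a minimal toy example of my own (emphatically not the actual self-learning algorithm): classic Reinhard-style statistics matching, which shifts and scales each channel of a frame toward the mean and spread of a graded reference.

```python
# Toy illustration: match a frame's per-channel mean and spread to a
# graded reference frame. Frames are lists of (r, g, b) tuples in 0..1.
import statistics

def channel_stats(frame, c):
    """Mean and population std-dev of one channel."""
    vals = [p[c] for p in frame]
    return statistics.mean(vals), statistics.pstdev(vals)

def match_colors(frame, reference):
    """Shift/scale each channel of `frame` toward `reference`."""
    stats = [(channel_stats(frame, c), channel_stats(reference, c))
             for c in range(3)]
    out = []
    for p in frame:
        q = []
        for c, ((m_s, s_s), (m_r, s_r)) in enumerate(stats):
            scale = s_r / s_s if s_s else 1.0
            q.append(min(1.0, max(0.0, (p[c] - m_s) * scale + m_r)))
        out.append(tuple(q))
    return out
```

A global shift like this is exactly the kind of blanket grade that falls apart across scenes; the whole point of training on multiple references is to do better than this.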

As proof here’s a Star Wars bluray regrade with a self-learning algorithm, based on eight example regrades for different scenes (not including this one) provided by NeverarGreat on his thread:

Comparison:

http://screenshotcomparison.com/comparison/212890

Here are a couple more regraded shots by the self-learning algorithm, based on references provided by NeverarGreat:

=========================================================================================

Original start of the thread:

As a first test I took three of NeverarGreat’s regrades for training my algorithm:

Next I allowed my algorithm to regrade another frame from the bluray:

Of course I was hoping it would look something like NeverarGreat’s result:

Here’s what the algorithm came up with:

Not bad for a first try…😃

Of course there will be limitations and such, but I think I’ve embarked on a new adventure…