
Color matching and prediction: color correction tool v1.3 released!

Author
Time

Sad for Windows users. Hopefully a Windows user with a 64-bit Matlab will come by soon…

And I have another problem that I couldn’t think of a solution to: even though I match using a montage, sometimes a couple of colors are still off, but when I crop and match only those problem areas, the colors do match. How do I fix this?

Author
Time

Here you go everybody:

Dr Dre’s color Matching tool v1.2 (latest version) for 64 bit Windows*:

https://ln.sync.com/dl/e51125f10/njry8grb-juw5i7vz-ttrapwci-cb4cxh9j (385 MB)

Dr Dre’s color Matching tool v1.2 (latest version) for 64 bit OSX:

https://ln.sync.com/dl/89689baa0/6c22akdh-3ct3f4wr-mwmscjkw-5q7xg6hb (10 MB)**

* If you have a 64-bit version of Windows AND you have more than 4 GB of RAM AND you need to color match two giant images (4K or 8K with 16-bit color, or two giant 8K-size montages of images), then it would be worth uninstalling the 32-bit version and downloading and installing the 64-bit version. I don’t think you will find it is any faster under “normal use” situations.

** Why is the Mac version so much smaller? Simple really: in 2014, Matlab enhanced its package builder with some new features, one of which is an option to download the runtime package from the internet during install. The Windows version was compiled with the 2012 compiler, which doesn’t have this option (or the option to change the icon and splash screen).

Enjoy!

If you have any issues with the new versions, let us know.

TheStarWarsTrilogy.com.
The007Dossier.com.
Donations always welcome: Paypal | Bitcoin: bc1qzr9ejyfpzm9ea2dglfegxzt59tys3uwmj26ytj

Author
Time

Wow I was just starting to roll out my plan for the Dragon Ball Z color correction project like 12 hours ago in the kanzenshuu forum and this bomb DROPS. Huge thanks!

The program extracted and installed successfully; the only problem is that it didn’t delete the MCRInstaller.exe file after the installation.

Tested on matching two 8K montages; works without a problem, and RAM usage peaked at 3.1 GB.

Author
Time

thanks a bunch, williarob!

You really only need to hang mean bastards,
but mean bastards you need to hang.

John ‘The Hangman’ Ruth

Author
Time

Thank you so much for the Mac version! 😃

What can you get a Wookiee for (Life Day) Christmas when he already owns a comb?

Author
Time

Bug report, don’t know why it’s happening, but when I match a source, each time the progress bar progresses, the Figure 1 window also shifts over a couple of pixels to the right. This never occurred with the 32-bit version, so I suppose it’s an issue due either to the change in compiler version or to it being 64-bit now.

Author
Time

Thanks Williarob! You’re the best! 😃

Author
Time

Just wondering, is there ever such a thing as setting the “color spaces” value too high? For example, if I’m working with DVD footage, would a LUT that was generated with 100 color spaces potentially cause some glitching in the video (because the DVD source video may not have that many color spaces to work with)? And if this is true, is there some kind of standard cut-off for color spaces that can be used for DVD video vs. Blu-ray video vs. 4K film scans?

Author
Time

In principle more spaces will result in a better color match, but it’s possible if artifacts appear early on, they may get worse with each iteration. In that case increasing the smoothing parameter will prevent such artifacts.
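To illustrate that trade-off outside the tool (this is not DrDre's actual algorithm, just a generic numpy sketch with arbitrary sample counts), a per-channel transfer curve built from more quantile points follows the reference more tightly, while a wider smoothing window on the curve damps the local spikes that show up as artifacts:

```python
import numpy as np

def transfer_curve(src, ref, points=100, smoothing=0.0):
    """Build a 256-entry transfer curve for one 8-bit channel by matching
    quantiles of src to ref, optionally smoothed with a box filter."""
    qs = np.linspace(0, 100, points)
    src_q = np.percentile(src, qs)
    ref_q = np.percentile(ref, qs)
    # Map every possible input level through the quantile correspondence.
    curve = np.interp(np.arange(256), src_q, ref_q)
    if smoothing > 0:
        # Wider window = smoother curve = fewer local spikes, but a looser match.
        width = max(3, int(smoothing * 50) | 1)  # odd window size
        kernel = np.ones(width) / width
        padded = np.pad(curve, width // 2, mode="edge")
        curve = np.convolve(padded, kernel, mode="valid")
    return np.clip(curve, 0, 255)

# More points = a tighter match; raising the smoothing parameter
# (e.g. towards 0.98) flattens local kinks in the resulting curve.
src = np.random.randint(0, 256, 10000)
ref = np.clip(src * 0.9 + 20, 0, 255)
curve_tight = transfer_curve(src, ref, points=100, smoothing=0.0)
curve_smooth = transfer_curve(src, ref, points=100, smoothing=0.98)
```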

Author
Time

DrDre said:

In principle more spaces will result in a better color match, but it’s possible if artifacts appear early on, they may get worse with each iteration. In that case increasing the smoothing parameter will prevent such artifacts.

Here’s the thing. I actually set it at 100 color spaces and the smoothing parameter to .98 (yes you saw that correctly haha). In the program itself, I’m satisfied with how the frame looks after the color correction model is built. But when I’m actually inspecting the video after the LUT has been applied, there are sometimes glitchy areas whenever there’s something shiny in the video. I’ll show you some pictures so you see what I mean. I’ll zoom in on the areas of interest in the frames.

Uncorrected:
[screenshot]

Corrected with LUT 1 (better LUT overall, but causes glitchy spots, as you can see in the shiny areas):
[screenshot]

Corrected with LUT 2 (not as good as LUT 1, but doesn’t glitch out the shiny areas):
[screenshot]

One workaround that sometimes works is to re-build the color correction model & LUT but with fast processing. However, this sometimes makes After Effects not even accept the LUT, so it doesn’t always work. What I’m wondering, though, is: why does this even happen, and what can I do to prevent it? (I’m making sure that my test & reference frames are as closely cropped as possible).
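Not specific to DrDre's tool, but when After Effects rejects a LUT it is often because the .cube file itself is malformed. A rough sanity check like the sketch below (the file name is hypothetical, and these are only the common failure modes) verifies that the LUT_3D_SIZE header matches the number of data rows and that no values fall outside 0..1:

```python
def check_cube_lut(path):
    """Rough sanity check of an Iridas/Adobe .cube 3D LUT: the LUT_3D_SIZE
    header must match the number of data rows, and values should sit in 0..1."""
    size, rows, out_of_range = None, 0, 0
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
                continue
            if line[0].isalpha():  # TITLE, DOMAIN_MIN, DOMAIN_MAX, ...
                continue
            r, g, b = (float(v) for v in line.split()[:3])
            rows += 1
            if not all(0.0 <= v <= 1.0 for v in (r, g, b)):
                out_of_range += 1
    expected = size ** 3 if size else None
    print(f"size header: {size}, data rows: {rows}, expected rows: {expected}")
    print(f"rows outside [0, 1]: {out_of_range}")
    return size is not None and rows == expected and out_of_range == 0

# Hypothetical usage:
# check_cube_lut("correction.cube")
```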

Author
Time

One thing is that the frames with the shiny objects should be used for building the model; otherwise the model will have to estimate the results for the shiny objects, which may lead to artifacts.
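One hedged way to check that in advance, outside the tool (the margin value is an arbitrary assumption): compare the per-channel brightness range of the footage you will correct against the range covered by the frames used to build the model, and warn where the model would have to extrapolate.

```python
import numpy as np
from PIL import Image

def coverage_warning(training_paths, footage_paths, margin=5):
    """Warn if the footage contains per-channel levels that fall outside what
    the training frames cover, i.e. where the model would have to extrapolate."""
    def channel_range(paths):
        lo, hi = np.full(3, 255.0), np.zeros(3)
        for p in paths:
            px = np.asarray(Image.open(p).convert("RGB"), dtype=float).reshape(-1, 3)
            lo = np.minimum(lo, px.min(axis=0))
            hi = np.maximum(hi, px.max(axis=0))
        return lo, hi

    t_lo, t_hi = channel_range(training_paths)
    f_lo, f_hi = channel_range(footage_paths)
    for c, name in enumerate("RGB"):
        if f_hi[c] > t_hi[c] + margin or f_lo[c] < t_lo[c] - margin:
            print(f"{name}: footage range {f_lo[c]:.0f}-{f_hi[c]:.0f} exceeds the "
                  f"training range {t_lo[c]:.0f}-{t_hi[c]:.0f}; consider adding a "
                  f"frame with those highlights/shadows to the montage.")
```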

Author
Time

Hmm good idea! I’ll try that. Thanks for the help and for making the program to begin with!

Author
Time

This glitching in the highlight areas has always been a problem, and is a big reason I never used this method in my project. The artifacts look similar to artifacts generated by Photoshop’s Hue/Saturation tool, where shifting the hue of a particular color runs into problems when it encounters high-luminosity or low-saturation areas. These areas are shifted to the same extent as saturated midtones, leading to blocks of darker or lighter colors in almost colorless walls or skies. I don’t know if this is what the problem is in this case, but perhaps the program would benefit from an algorithm which takes this into account and shifts color less and less as the luminosity approaches 255 or 0 and the saturation approaches 0.
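That idea is easy to prototype outside the tool. A minimal sketch (not the program's algorithm; the falloff width is an arbitrary assumption) that blends a corrected frame back towards the original wherever lightness approaches the extremes or saturation approaches zero:

```python
import numpy as np

def protect_extremes(original, corrected, falloff=0.15):
    """Blend `corrected` back towards `original` (both float RGB arrays in 0..1)
    wherever a pixel is near-white, near-black, or nearly unsaturated, so the
    color shift fades out exactly where hue-shift style artifacts tend to appear."""
    maxc = original.max(axis=-1)
    minc = original.min(axis=-1)
    light = (maxc + minc) / 2.0                                # HSL lightness, 0..1
    sat = np.where(maxc > 0, (maxc - minc) / np.maximum(maxc, 1e-6), 0.0)

    # Each weight falls to 0 at its problem region and reaches 1 `falloff` away.
    w_high = np.clip((1.0 - light) / falloff, 0.0, 1.0)        # fade near white
    w_low = np.clip(light / falloff, 0.0, 1.0)                 # fade near black
    w_sat = np.clip(sat / falloff, 0.0, 1.0)                   # fade near grey
    w = (w_high * w_low * w_sat)[..., None]

    return original + w * (corrected - original)
```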

You probably don’t recognize me because of the red arm.
Episode 9 Rewrite, The Starlight Project (Released!) and ANH Technicolor Project (Released!)

Author
Time
 (Edited)

It isn’t really a problem if you don’t use the algo to adjust colors you didn’t train it on in the first place. The algo shifts high/low saturation areas to how they appear in the reference frame. This is what it is designed to do. Artifacts are usually caused by extrapolation of the curves adjustment to colors it wasn’t properly trained to deal with. If you train the algorithm on those highlights, it will work just fine, and you have the smoothing option to further smooth the gradients where necessary. Like any method it takes experience to use it correctly. If you expect you can use it like some magic wand with zero effort, you’re going to run into trouble. I’ve rarely come across a source and reference that couldn’t be accurately matched with zero artifacts by carefully tuning the parameters.

I would advise everyone who experiences artifacts or other problems to post the source and reference frames in this thread. I will then create a correction model, which will hopefully have none of these issues, and tell you exactly how I did it.

Author
Time

In very dynamic scenes, the color matching needs to be done frame by frame, such as when lightning strikes or police sirens are running and mess with all of the colors in adjacent frames. Also, I am a perfectionist and I notice when it isn’t exactly perfect.

This is another reason for my request to have the program work with four folders (cropped source color frames, cropped target color frames, full sized source frames to correct, full sized corrected frames) and automatically color match each numbered frame (they can all have the same filename in each folder since they are in different folders).

I already have the folders set up, and use Photoshop to script out the aligning and cropping of each source and target color frame to folders, but when using the color matching tool I have to load each frame, go through the process one at a time, and babysit it, and each frame takes a long time when I increase the color spaces to the maximum for the most accurate results. I would love to be able to set up the folders and have it run on its own.
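For reference, the requested batch loop would look roughly like the sketch below, assuming a hypothetical build_and_apply(source_crop, target_crop, full_frame, out_frame) entry point that the tool does not currently expose outside its GUI; the folder names simply mirror the four folders described above.

```python
import os

# Hypothetical batch driver for the four-folder workflow described above.
# `build_and_apply` stands in for the tool's build-model-then-apply-LUT step,
# which is currently only reachable through the GUI.
SRC_CROPS = "cropped_source_color_frames"
TGT_CROPS = "cropped_target_color_frames"
FULL_FRAMES = "full_sized_source_frames"
OUT_FRAMES = "full_sized_corrected_frames"

def batch_match(build_and_apply):
    os.makedirs(OUT_FRAMES, exist_ok=True)
    for name in sorted(os.listdir(FULL_FRAMES)):
        src_crop = os.path.join(SRC_CROPS, name)
        tgt_crop = os.path.join(TGT_CROPS, name)
        full = os.path.join(FULL_FRAMES, name)
        out = os.path.join(OUT_FRAMES, name)
        if not (os.path.exists(src_crop) and os.path.exists(tgt_crop)):
            print(f"skipping {name}: missing cropped frame")
            continue
        build_and_apply(src_crop, tgt_crop, full, out)
        print(f"matched {name}")
```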

Thanks again for your consideration.

Author
Time

+1 for the batch color matching. I wrote a script to generate montages for a video, and there are about 350 of them per episode. I don’t want to sit there doing nothing but manually matching them one by one.
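For anyone who has not scripted the montage step yet, a minimal PIL sketch of tiling frames into a grid (the grid size and thumbnail width are arbitrary assumptions):

```python
from PIL import Image

def make_montage(frame_paths, cols=5, thumb_width=480):
    """Tile a list of frame files into a single montage image, row-major."""
    thumbs = []
    for p in frame_paths:
        img = Image.open(p).convert("RGB")
        h = round(img.height * thumb_width / img.width)
        thumbs.append(img.resize((thumb_width, h)))
    rows = (len(thumbs) + cols - 1) // cols
    cell_h = max(t.height for t in thumbs)
    montage = Image.new("RGB", (cols * thumb_width, rows * cell_h))
    for i, t in enumerate(thumbs):
        montage.paste(t, ((i % cols) * thumb_width, (i // cols) * cell_h))
    return montage

# e.g.: make_montage(sorted(glob.glob("frames/*.png"))).save("montage_001.png")
```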

Author
Time
 (Edited)

DrDre said:

It isn’t really a problem if you don’t use the algo to adjust colors you didn’t train it on in the first place. The algo shifts high/low saturation areas to how they appear in the reference frame. This is what it is designed to do. Artifacts are usually caused by extrapolation of the curves adjustment to colors it wasn’t properly trained to deal with. If you train the algorithm on those highlights, it will work just fine, and you have the smoothing option to further smooth the gradients where necessary. Like any method it takes experience to use it correctly. If you expect you can use it like some magic wand with zero effort, you’re going to run into trouble. I’ve rarely come across a source and reference that couldn’t be accurately matched with zero artifacts by carefully tuning the parameters.

I would advise everyone who experiences artifacts or other problems to post the source and reference frames in this thread. I will then create a correction model, which will hopefully have none of these issues, and tell you exactly how I did it.

Interesting. I found I could kill 99% of the highlight glitches from the algorithm by putting a white block in both source and target images. I had reasoned that it was somehow blowing out the highlights and clipping into black or something like that, so giving it a white block lets it know to keep the whites white in both images (since white matches in both images, the correction won’t overshoot and clip). 😃
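The same trick can be scripted. A small numpy/PIL sketch (block size, placement, and output file names are arbitrary assumptions) that pastes identical white and black anchor patches into both frames before they are fed to the model:

```python
import numpy as np
from PIL import Image

def add_anchor_patches(src_path, ref_path, block=32):
    """Paste identical white and black blocks into the top-left corner of both
    the source and reference frame, so the model is trained to keep white white
    and black black instead of extrapolating in the highlights/shadows."""
    src = np.asarray(Image.open(src_path).convert("RGB")).copy()
    ref = np.asarray(Image.open(ref_path).convert("RGB")).copy()
    for img in (src, ref):
        img[:block, :block] = 255             # white anchor
        img[:block, block:2 * block] = 0      # black anchor next to it
    Image.fromarray(src).save("source_anchored.png")
    Image.fromarray(ref).save("reference_anchored.png")
```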

Author
Time
 (Edited)

Can colormatch be used in the following way to convert HDR to SDR (sort of)?

Play back the HDR video in a media player such as mpc-hc; it shows faded colors. Take a screenshot.
Play back the same video in mpc-hc with madVR as the renderer, which reads the HDR and gives decent colors for SDR. Take another screenshot.
Then use those two screenshots in colormatch?

Author
Time

I don’t follow. Could you post the two screenshots?

Author
Time
 (Edited)

Here are the images - https://imgur.com/a/miTtl

Top image is from the player that can read the HDR info and gives a watchable SDR image.
Bottom is from the player that can’t read the HDR info.

HDR videos have a different color space than SDR videos (BT.2020 vs. BT.709). I’m sure you know a lot more than I do about these, but does it really matter in this case of just using colormatch to do its magic?

Author
Time

tyee said:

Here are the images - https://imgur.com/a/miTtl

Top image is from the player that can read the HDR info and gives a watchable SDR image.
Bottom is from the player that can’t read the HDR info.

HDR videos have a different color space than SDR videos (BT.2020 vs. BT.709). I’m sure you know a lot more than I do about these, but does it really matter in this case of just using colormatch to do its magic?

No, you can use one screenshot as a reference to correct the other, as long as you watch it in SDR space.
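For anyone without the tool handy, the same one-screenshot-as-reference idea can be roughed out with plain per-channel histogram matching; a minimal numpy sketch (not the tool's algorithm, and the screenshot file names are hypothetical):

```python
import numpy as np
from PIL import Image

def match_histograms(source, reference):
    """Per-channel histogram matching: remap each channel of `source` so its
    cumulative distribution matches that of `reference` (both uint8 RGB arrays)."""
    out = np.empty_like(source)
    for c in range(3):
        s = source[..., c].ravel()
        r = reference[..., c].ravel()
        s_vals, s_idx, s_counts = np.unique(s, return_inverse=True, return_counts=True)
        r_vals, r_counts = np.unique(r, return_counts=True)
        s_cdf = np.cumsum(s_counts) / s.size
        r_cdf = np.cumsum(r_counts) / r.size
        mapped = np.interp(s_cdf, r_cdf, r_vals)
        out[..., c] = mapped[s_idx].reshape(source.shape[:2]).astype(np.uint8)
    return out

# faded = np.asarray(Image.open("mpc_hc_screenshot.png").convert("RGB"))
# good = np.asarray(Image.open("madvr_screenshot.png").convert("RGB"))
# Image.fromarray(match_histograms(faded, good)).save("faded_matched.png")
```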