
lansing

User Group
Members
Join date
2-May-2017
Last activity
21-Feb-2022
Posts
32

Post History

Post
#1096441
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

DrDre said:

lansing said:

Williarob said:

alexp120 said:

alexp120 said:
If you look at williarob’s “Restoring Color to Red Faded Film” video

http://thestarwarstrilogy.com/starwars/post/2016/09/12/Restoring-Color-to-a-Faded-Eastman-Print-of-Star-Wars

…beginning at the 15:47 mark, he explains how you can build a model containing a wide variety of colors. You are grabbing sample frames of the shot that you are working on from both the reference video and the test video, and creating LUTs of all the colors in that shot.

I noticed that the number of sample frames of the shot from both the reference video and the test video is 16. Why 16?

I think it had to do with how big the montage image was going to be. I didn’t want any blank areas, so it had to be 4x4 or 3x3 or 5x5, but if I recall correctly, any bigger than 4x4 at that resolution (each frame about 848x360? - something like that) would just leave the color match tool hanging indefinitely on my system. Your system may be more or less powerful than mine, so your mileage may vary.

If you were using the tool to grade a film like Raiders or Star Wars then I would recommend you take 8 to 16 frames from every scene (rather than shot) and turn them into montages, one of the source (or test) frames and one of the target (reference) frames and create a LUT for each scene.

In my testing, I did try grabbing up to 64 small frames (smaller than SD) from an entire reel of Star Wars and generating montages and a single LUT for an entire reel, but found it was much less accurate than doing it on a scene by scene basis. However, this was probably because the reel did not have consistent colors (parts of the reel were scanned with different color settings) so 1 LUT to rule them all was not possible until all the shots had been color balanced first.

However, you may be able to create a single montage for an entire episode of what you are working on, especially if the color changes are consistent.

This is a very late follow-up because I was slacking off. I created a montage reference image and it sure is more accurate. My question now is: is a 4x4 montage the overall best size as a reference for a scene? Or is it better to build the montage with one frame from each second of the scene?

My second question is: what is the minimum resolution we can use to create the montage without affecting the quality of the color matching model the program builds? It seems too crazy to build the montage with full-size 1920x1080 frames.

Montages will work, but the more frames you include, the more difficult it is to match them, with a higher probability of artifacts. In the fast mode the frame size is significantly reduced.

For making a montage, what is a good size to reduce each individual frame to? It makes sense to reduce a 4x4 montage of 1920x1080 frames, but it doesn’t make sense to shrink a 4x4 montage that is 1920x1080 in total, since each frame inside it is already very small.
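Since the thread keeps coming back to how many frames to grab and how to tile them, here is a rough sketch of one way to do it. This is not part of DrDre's tool; `scene.mkv`, the 480-pixel scale, and the 4x4 tile are all placeholder assumptions:

```python
# Sketch: pick 16 evenly spaced frame numbers from a scene, then tile them
# into a 4x4 montage. The index math is generic; the ffmpeg command printed
# at the end is just one assumed way to extract and tile the frames.

def sample_indices(scene_frames, count=16):
    """Return `count` frame indices spread evenly across the scene."""
    if count >= scene_frames:
        return list(range(scene_frames))
    step = scene_frames / count
    # take the middle of each of the `count` equal slices
    return [int(step * i + step / 2) for i in range(count)]

if __name__ == "__main__":
    idx = sample_indices(240, 16)  # e.g. a 10 s scene at 24 fps
    print(idx)
    # One assumed way to extract and tile them with ffmpeg's select/tile filters,
    # downscaling each frame to 480 pixels wide before tiling:
    expr = "+".join(f"eq(n\\,{i})" for i in idx)
    print(f"ffmpeg -i scene.mkv -vf \"select='{expr}',scale=480:-2,tile=4x4\" "
          "-frames:v 1 montage.png")
```

Downscaling before tiling keeps the montage small enough that the match tool doesn't hang, per Williarob's note above about 4x4 being the practical limit on his system.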

Post
#1096425
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

Williarob said:

alexp120 said:

alexp120 said:
If you look at williarob’s “Restoring Color to Red Faded Film” video

http://thestarwarstrilogy.com/starwars/post/2016/09/12/Restoring-Color-to-a-Faded-Eastman-Print-of-Star-Wars

…beginning at the 15:47 mark, he explains how you can build a model containing a wide variety of colors. You are grabbing sample frames of the shot that you are working on from both the reference video and the test video, and creating LUTs of all the colors in that shot.

I noticed that the number of sample frames of the shot from both the reference video and the test video is 16. Why 16?

I think it had to do with how big the montage image was going to be. I didn’t want any blank areas, so it had to be 4x4 or 3x3 or 5x5, but if I recall correctly, any bigger than 4x4 at that resolution (each frame about 848x360? - something like that) would just leave the color match tool hanging indefinitely on my system. Your system may be more or less powerful than mine, so your mileage may vary.

If you were using the tool to grade a film like Raiders or Star Wars then I would recommend you take 8 to 16 frames from every scene (rather than shot) and turn them into montages, one of the source (or test) frames and one of the target (reference) frames and create a LUT for each scene.

In my testing, I did try grabbing up to 64 small frames (smaller than SD) from an entire reel of Star Wars and generating montages and a single LUT for an entire reel, but found it was much less accurate than doing it on a scene by scene basis. However, this was probably because the reel did not have consistent colors (parts of the reel were scanned with different color settings) so 1 LUT to rule them all was not possible until all the shots had been color balanced first.

However, you may be able to create a single montage for an entire episode of what you are working on, especially if the color changes are consistent.

This is a very late follow-up because I was slacking off. I created a montage reference image and it sure is more accurate. My question now is: is a 4x4 montage the overall best size as a reference for a scene? Or is it better to build the montage with one frame from each second of the scene?

My second question is: what is the minimum resolution we can use to create the montage without affecting the quality of the color matching model the program builds? It seems too crazy to build the montage with full-size 1920x1080 frames.

Post
#1075538
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

alexp120 said:

lansing said:

DrDre said:

lansing said:

DrDre said:

lansing said:

DrDre said:

It’s generally much faster to export a LUT for use in DaVinci Resolve or Adobe After Effects than to do it frame by frame in the tool. Shot-by-shot correction still takes a lot of time though, as each LUT is good for roughly 100 frames, and there are roughly 2,000 shots in a two-hour film.

Thanks, I imported the generated LUT into Premiere and it applied instantly, at the cost of a spike in RAM.

I’m testing it out on my Dragon Ball footage now, and I have a panning shot with noticeable ringing artifacts along the lines. I have to turn smoothing up to 0.9 to get rid of them. Is this normal? Can you take a look? I uploaded the reference and test images.

reference: http://i.imgur.com/id8AiRx.png
test: http://i.imgur.com/I54GI7O.png

Did you make sure the cropping is the same for the test and reference frames? If so, could you also share the raw test frame?

I tested on 3 other scenes; they all work fine even without cropping. I did try cropping this test image to match the reference, but the problem remains: white ringing between the character and the sky. I also tried sampling from the last frame of the same scene, and it works fine again without needing to pump up the smoothing parameter.

reference: http://i.imgur.com/YOCkH0E.png
test: http://i.imgur.com/sRUrxNw.png

The fact that it works without cropping is fortunate, but the method assumes the image content is exactly the same except for the colors. It may work fine for some frames, but give artifacts for others.

The reason you get ringing is that the ringing is already present in the test frame, a side effect of DNR:

The ringing has roughly the same color as the clouds next to it, both being light blue.

The clouds in the reference image are whiter than in the test frame, while the sky is a darker blue:

By color matching the clouds, you automatically also give the ringing roughly the same color as the clouds, causing it to be more apparent against the darker blue background. Sadly there’s very little that can be done about this.

By increasing the smoothing parameter, you reduce the color gradients: the algorithm tries to keep the gradients as close to the original as possible, while also trying to find the best color match. This may reduce the ringing somewhat, but will generally also result in a poorer color match.

Thanks for the explanation. So I think the best approach here is to find a frame in the scene that has the least ringing and covers a wide variety of colors, build a matching model from it, apply it to the entire scene, and hope for the best.

If you look at williarob’s “Restoring Color to Red Faded Film” video

http://thestarwarstrilogy.com/starwars/post/2016/09/12/Restoring-Color-to-a-Faded-Eastman-Print-of-Star-Wars

…beginning at the 15:47 mark, he explains how you can do just that: build a model containing a wide variety of colors. You are grabbing sample frames of the shot that you are working on from both the reference video and the test video, and creating LUTs of all the colors in that shot.

I will see how effective the montage approach is, but where can I find this “color balancing” tool?
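For anyone curious what the exported LUT that gets passed to Premiere or Resolve actually contains: a .cube file is just a sampled 3D table mapping input RGB to output RGB. The sketch below is not from DrDre's tool; it parses the table and does a nearest-neighbour lookup, whereas real hosts interpolate between entries, so treat it as an illustration of the format only:

```python
# Minimal sketch of reading a .cube 3D LUT and looking up a color.
# Nearest-neighbour only; Resolve/Premiere interpolate trilinearly.

def parse_cube(text):
    size, table = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        else:
            parts = line.split()
            if len(parts) == 3:
                try:
                    table.append(tuple(float(p) for p in parts))
                except ValueError:
                    pass  # skip TITLE and other header lines
    return size, table

def apply_lut(rgb, size, table):
    # In .cube files the red index varies fastest, then green, then blue.
    r, g, b = (min(size - 1, round(c * (size - 1))) for c in rgb)
    return table[r + size * g + size * size * b]
```

This also shows why one LUT is only good for a limited stretch of footage: the table is a single fixed color mapping, so once the scan's color balance drifts, the same input colors need a different output and a new LUT has to be built.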

Post
#1072635
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

DrDre said:

lansing said:

DrDre said:

lansing said:

DrDre said:

It’s generally much faster to export a LUT for use in DaVinci Resolve or Adobe After Effects than to do it frame by frame in the tool. Shot-by-shot correction still takes a lot of time though, as each LUT is good for roughly 100 frames, and there are roughly 2,000 shots in a two-hour film.

Thanks, I imported the generated LUT into Premiere and it applied instantly, at the cost of a spike in RAM.

I’m testing it out on my Dragon Ball footage now, and I have a panning shot with noticeable ringing artifacts along the lines. I have to turn smoothing up to 0.9 to get rid of them. Is this normal? Can you take a look? I uploaded the reference and test images.

reference: http://i.imgur.com/id8AiRx.png
test: http://i.imgur.com/I54GI7O.png

Did you make sure the cropping is the same for the test and reference frames? If so, could you also share the raw test frame?

I tested on 3 other scenes; they all work fine even without cropping. I did try cropping this test image to match the reference, but the problem remains: white ringing between the character and the sky. I also tried sampling from the last frame of the same scene, and it works fine again without needing to pump up the smoothing parameter.

reference: http://i.imgur.com/YOCkH0E.png
test: http://i.imgur.com/sRUrxNw.png

The fact that it works without cropping is fortunate, but the method assumes the image content is exactly the same except for the colors. It may work fine for some frames, but give artifacts for others.

The reason you get ringing is that the ringing is already present in the test frame, a side effect of DNR:

The ringing has roughly the same color as the clouds next to it, both being light blue.

The clouds in the reference image are whiter than in the test frame, while the sky is a darker blue:

By color matching the clouds, you automatically also give the ringing roughly the same color as the clouds, causing it to be more apparent against the darker blue background. Sadly there’s very little that can be done about this.

By increasing the smoothing parameter, you reduce the color gradients: the algorithm tries to keep the gradients as close to the original as possible, while also trying to find the best color match. This may reduce the ringing somewhat, but will generally also result in a poorer color match.

Thanks for the explanation. So I think the best approach here is to find a frame in the scene that has the least ringing and covers a wide variety of colors, build a matching model from it, apply it to the entire scene, and hope for the best.
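As a toy model of the smoothing trade-off described above (this is NOT the tool's actual algorithm, which constrains gradients during the fit; it is only a way to picture why more smoothing means a weaker match), imagine blending the matched color back toward the original pixel:

```python
# Toy illustration: smoothing pictured as a blend between the color-matched
# value and the original pixel. smoothing=0 keeps the full color match;
# smoothing=1 keeps the original untouched (and its original gradients).

def blend(original, matched, smoothing):
    """Per-channel blend of two RGB tuples, weights in [0, 1]."""
    return tuple(smoothing * o + (1.0 - smoothing) * m
                 for o, m in zip(original, matched))
```

At smoothing 0.9, as in the panning shot above, 90% of the original value survives, which is why the ringing stays subdued but the colors also drift away from the reference.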

Post
#1072583
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

DrDre said:

lansing said:

DrDre said:

It’s generally much faster to export a LUT for use in DaVinci Resolve or Adobe After Effects than to do it frame by frame in the tool. Shot-by-shot correction still takes a lot of time though, as each LUT is good for roughly 100 frames, and there are roughly 2,000 shots in a two-hour film.

Thanks, I imported the generated LUT into Premiere and it applied instantly, at the cost of a spike in RAM.

I’m testing it out on my Dragon Ball footage now, and I have a panning shot with noticeable ringing artifacts along the lines. I have to turn smoothing up to 0.9 to get rid of them. Is this normal? Can you take a look? I uploaded the reference and test images.

reference: http://i.imgur.com/id8AiRx.png
test: http://i.imgur.com/I54GI7O.png

Did you make sure the cropping is the same for the test and reference frames? If so, could you also share the raw test frame?

I tested on 3 other scenes; they all work fine even without cropping. I did try cropping this test image to match the reference, but the problem remains: white ringing between the character and the sky. I also tried sampling from the last frame of the same scene, and it works fine again without needing to pump up the smoothing parameter.

reference: http://i.imgur.com/YOCkH0E.png
test: http://i.imgur.com/sRUrxNw.png

Post
#1072301
Topic
Color matching and prediction: color correction tool v1.3 released!
Time

DrDre said:

It’s generally much faster to export a LUT for use in DaVinci Resolve or Adobe After Effects than to do it frame by frame in the tool. Shot-by-shot correction still takes a lot of time though, as each LUT is good for roughly 100 frames, and there are roughly 2,000 shots in a two-hour film.

Thanks, I imported the generated LUT into Premiere and it applied instantly, at the cost of a spike in RAM.

I’m testing it out on my Dragon Ball footage now, and I have a panning shot with noticeable ringing artifacts along the lines. I have to turn smoothing up to 0.9 to get rid of them. Is this normal? Can you take a look? I uploaded the reference and test images.

reference: http://i.imgur.com/id8AiRx.png
test: http://i.imgur.com/I54GI7O.png