Author Topic: Designer and UE4 rendering results differ (sampler issue?)

I recently found that the UE4 rendering result differs from Designer's.
I would like to report this, as I could reproduce it with a very simple material.

In the following comparison, the left-hand side from Designer looks just fine, while the right-hand side, the texture rendered in UE4, looks much lower resolution, even though both are 2k.

I suspect this comes from the pixel processor.
In the attached SBS graph, I made gradients and used their values as the x and y inputs of sampler1 (this allows optimizing transformations in a very complex graph).
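For readers unfamiliar with the technique, the idea can be sketched outside Designer. This is a minimal NumPy analogue of "gradients feeding a sampler's position input", not actual Pixel Processor code; the sizes and array names are illustrative:

```python
import numpy as np

size = 8  # small stand-in for the 2k resolution

# Two gradient maps encode the (x, y) lookup position per pixel,
# like feeding linear gradients into the sampler's position input.
xs, ys = np.meshgrid(np.linspace(0, 1, size), np.linspace(0, 1, size))

# The texture we want to sample indirectly.
texture = np.arange(size * size, dtype=float).reshape(size, size)

# "Sampler" step: convert positions to texel indices and fetch (nearest).
ix = np.clip(np.round(xs * (size - 1)).astype(int), 0, size - 1)
iy = np.clip(np.round(ys * (size - 1)).astype(int), 0, size - 1)
warped = texture[iy, ix]

# With identity gradients, the lookup reproduces the texture unchanged;
# distorting the gradients would warp the result instead.
assert np.array_equal(warped, texture)
```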

I would like to know whether this is by design (part of an optimization) or something else.

Thank you in advance!

I would first export from Designer to a 16-bit PNG and confirm in Photoshop that it has the quality you expect, then import that into UE4. UE4 applies some default compression; in your screenshot it says 'Default (DXT1/5)'. You can change that, but it will cost performance.
(This workflow assumes you import the images instead of the SBSAR directly into UE4. If the exported images are correct and look as expected, then something indeed goes wrong when importing the SBS file into UE4 rather than the images.)

Hi tom_14
Thanks a lot for your insight!

As you suggested, I exported the output as a PNG and imported it into UE4 as a static texture.
It seems the exported static texture doesn't have the same issue.

I attached a comparison of the PNG and the Substance texture in UE4 below, zoomed in a little so that the difference is more obvious.

Thanks again for your help :D!

I recently found a solution for this.

I just needed to switch the engine from CPU to GPU and refresh the textures (by altering $randomseed and so on).
I imagine the CPU engine has a little trouble mimicking GPU operations.

Yes, there are some differences between the CPU and GPU engines. In your case, the difference is probably caused by the fact that the CPU engine only supports 16-bit grayscale textures and 8-bit-per-channel color textures. Since you are using a color texture to store a position for later sampling of another texture, the position read from the first texture is quantized to 8 bits, allowing at most 256 values and causing visible blocks, as if the second texture's resolution had been reduced to 256x256. If you need to use the CPU engine, one solution would be to use two grayscale textures to store the initial position instead of a single color texture. Another possible mitigation is to use linear interpolation in the second sampler to reduce the blockiness, provided the first texture is smooth enough.
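The quantization effect described above can be reproduced numerically. A hedged NumPy sketch (sizes are illustrative, matching the 2k texture from the thread):

```python
import numpy as np

tex_size = 2048  # the second texture is 2k

# A smooth, full-precision position ramp across the texture width.
pos_float = np.linspace(0.0, 1.0, tex_size)

# CPU engine: the color position texture is stored at 8 bits per channel,
# so the stored position collapses to at most 256 distinct values.
pos_8bit = np.round(pos_float * 255) / 255

# Convert both to texel indices in the 2k texture (nearest sampling).
idx_full = np.round(pos_float * (tex_size - 1)).astype(int)
idx_quant = np.round(pos_8bit * (tex_size - 1)).astype(int)

print(len(np.unique(idx_full)))   # 2048 distinct sample positions
print(len(np.unique(idx_quant)))  # 256: the visible "blocks"
```

Only 256 of the 2048 texels are ever reached through the 8-bit position, which is exactly the "resolution reduced to 256x256" look described above.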
Last Edit: January 31, 2018, 12:09:58 pm

Thank you for the detailed explanation!

I knew Flood Fill doesn't work with the CPU engine but didn't quite know why, and that explains it too!
In my case I will just stick with the GPU engine, but I will definitely keep that in mind.

Thanks again :)