Messages - CodeRunner

1
If you are trying to create a single segment of texture that will be reused on many modules of road, there is really no way to fix the tiling problem other than to use less obvious grunge/dirt patterns.

But if you are trying to vary the texture repetition as it continues down the road (using unique textures on all of the road), there are many ways to fix that. One method is to use multiple "grunge" layers on top of each other, at different scales. For example, your base layer would be your road paint, with no noise (just flat). Then add one grunge/noise effect on top of that, set to multiply/overlay/etc, at a scale of (something around) 2.0. Add a second grunge/noise on top of that at a scale of 1.0. Add another scaled at 0.5. And tweak the scaling as necessary to make it look good.
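To make the layering idea concrete, here's a rough Python/NumPy sketch of the same thing. The value_noise helper is just a made-up placeholder for whatever grunge generator you'd actually use, and the 0.8/0.4 multiply numbers are arbitrary starting points to tweak:

```python
import numpy as np

def value_noise(size, scale, rng):
    """Hypothetical stand-in for a grunge/noise generator: coarse random
    values upsampled to the target size (larger `scale` = larger features)."""
    cells = max(2, int(16 / max(scale, 0.1)))      # fewer cells = bigger features
    coarse = rng.random((cells, cells))
    reps = int(np.ceil(size / cells))
    return np.kron(coarse, np.ones((reps, reps)))[:size, :size]

def layered_road_albedo(size=512, scales=(2.0, 1.0, 0.5), seed=0):
    """Flat road paint with several grunge layers multiplied on top at
    different scales, so no single repetition pattern dominates."""
    rng = np.random.default_rng(seed)
    out = np.full((size, size), 0.55)              # base layer: flat paint, no noise
    for s in scales:
        grunge = value_noise(size, s, rng)
        out *= 0.8 + 0.4 * grunge                  # multiply-style blend; tweak to taste
    return np.clip(out, 0.0, 1.0)
```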

If you are in the first situation, you should also consider that your texture will look more repetitive farther away. Up close, it won't be so noticeable. You just have to keep tweaking your concrete noise until it looks less obvious from the distance the camera will be at.

A third option is to create two versions of each road type and randomly choose between them. Three would be better, four even better, and so on. All versions would need to be identical around the edges so they can connect.
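Something like this is all the random selection needs to be (rough Python sketch, names made up), as long as the pick is deterministic per segment so the same segment always gets the same variant:

```python
import hashlib

def pick_road_variant(segment_id: int, num_variants: int = 4) -> int:
    """Deterministically pick one of `num_variants` pre-made road textures
    for a given segment, so the choice is stable between runs."""
    digest = hashlib.sha256(str(segment_id).encode()).digest()
    return digest[0] % num_variants

# Each segment gets a stable variant index in [0, num_variants)
print([pick_road_variant(i) for i in range(10)])
```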

One last idea - you can also create your road textures without much concrete noise at all in Painter, then blend a noise detail texture over all roads in your game/app/finished work, based on location, etc. This way, no two roads will look the same.
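A rough sketch of that last idea, with a fake sine-based noise standing in for a real detail texture and made-up parameter names:

```python
import math

def detail_noise(x: float, y: float) -> float:
    """Cheap stand-in for a tiling detail/grunge texture sampled in world space."""
    return 0.5 + 0.5 * math.sin(x * 12.9898 + y * 78.233)

def road_albedo_at(baked_albedo: float, world_x: float, world_y: float,
                   detail_strength: float = 0.15) -> float:
    """Blend a world-position-driven detail value over the baked road texture,
    so two road pieces placed at different locations never match exactly."""
    n = detail_noise(world_x * 0.05, world_y * 0.05)   # low-frequency variation
    return baked_albedo * (1.0 - detail_strength) + n * detail_strength
```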

2
Not sure if this is a bug or a feature request. When you use the copy/paste commands to copy a mask, the anchor point relationships are not maintained; all of the pasted anchor points end up broken.

If there is any way to maintain the links while pasting, I think it would make a lot of sense to do so.

Thanks!

3
Substance - Discussions - General PBR Metallic Question
 on: February 22, 2019, 03:45:47 pm 
I have a general question about realistic metallic values on surfaces like painted metal and rust. Does paint (over metal) remove 100% of the metallic value? If not, how does one estimate how much metallic to take away?

If it's all about reflecting light, I'm assuming the paint would completely change the metallic value, but I'm just not sure by how much. Maybe it depends on the type of paint? On how much light the paint allows to reach the metal surface? And does this completely eliminate the metallic response and just turn it into a glossiness/roughness change?

I also have some questions about rust, but this is probably something I can figure out by researching rust itself.

Thanks for any advice!

4
Substance Designer - Discussions - Re: UV Seam Repair Filter
 on: February 22, 2019, 12:25:11 am 
Honestly, I think the lack of loops is just them protecting us from hangs. Logic-wise, it would literally be the same as repeating the same function node x times. Any data that changes inside the loop is no different from a parameter being passed to that function and/or returned from it.

If the iteration count were hard-coded, they could literally unroll our loops and provide the feature without adding anything to their engine. I have a feeling this will be the first type of looping they provide, if they ever do.
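Just to illustrate what I mean by unrolling (plain Python, nothing Designer-specific): a loop with a hard-coded count is exactly the same as repeating the function call and passing the state through.

```python
def step(state: float, i: int) -> float:
    """Any per-iteration function: takes the loop state, returns the new state."""
    return state + 1.0 / (i + 1)

def with_loop(iterations: int = 4) -> float:
    state = 0.0
    for i in range(iterations):
        state = step(state, i)
    return state

def unrolled() -> float:
    # The same computation with the loop "unrolled" by hand, which is all a
    # fixed iteration count would require from the graph engine.
    state = 0.0
    state = step(state, 0)
    state = step(state, 1)
    state = step(state, 2)
    state = step(state, 3)
    return state

print(with_loop(4) == unrolled())  # True
```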

I don't know enough about Designer to be positive about anything, but that's my best guess.

5
> This is quite a convoluted function indeed.
Yeah, sorry, this was my first attempt at the pixel processor, and at any non-trivial function.

> There were some unplugged inputs.
I noticed that too after I uploaded, along with a few math errors. I was making a lot of alterations trying to make sense of what was happening, so I had a feeling some shoes were left untied. Sorry about that.

Fyi, the "switch" node does not optimizes the generated code, all functions are in-lined and all the nodes will be computed.
I noticed it seemed to function that way, but was hoping it was an interface thing, and not really the logic behind the scenes. That's a bummer. That explains why it's hitting a limit then, and explains why unplugging some of the switch options fixes it.

I was under the impression that (general) code compilation doesn't require logic like that to be processed when its condition isn't met. Is this something specific to the pixel processor? Something related to graphics processor limitations? And does it apply to all conditional branches? For example, if I have code such as "if( X ) then do Y()", and Y is completely unsafe to do unless X is true, is the entire thing unsafe to include?
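For reference, here's roughly what I now understand the flattened behavior to be (a plain Python sketch of the idea, not actual engine code): both branches get evaluated and the condition only selects a result, so the "true" branch has to be safe for every input.

```python
import math

def select(cond: bool, a: float, b: float) -> float:
    """Mimics a flattened switch: by the time we get here, BOTH a and b have
    already been evaluated; cond only decides which result is kept."""
    return a if cond else b

def inv_sqrt_or_zero(x: float) -> float:
    # Because both branches are computed, the "true" branch has to be safe for
    # every possible input (hence the max), even though its result is thrown
    # away whenever x <= 0.
    return select(x > 0.0, 1.0 / math.sqrt(max(x, 1e-8)), 0.0)

print(inv_sqrt_or_zero(4.0), inv_sqrt_or_zero(-1.0))  # 0.5 0.0
```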


It means all the "pixelSample" function is instantiated in total 15552 times if my maths is correct, which results in 62208 samples in total. That's quite a lot.
Yep. It was written with the assumption that only logic in true conditions would be processed. Now that I realize this isn't true, I may be able to figure out an alternate strategy.

> Having everything in a single pixel processor is not a good idea, and you are also not reusing results when computing the larger kernel.
You'll notice there was another pixel processor in the graph; they were originally a single processor, so it used to be even worse. I'm sure I can split them up now that I understand how it works internally.

> You should split your algorithm into multiple pixel processors, and maybe reuse results when computing the larger kernel (not sure if that's possible?). Also, maybe there is a better algorithm to compute what you are looking for (what is it, exactly?).
Feeling my way around was the primary purpose of this node. It's just an interpolation node: you plug in two inputs and it interpolates between them in different ways, for creating grunge maps, mixing noises, etc. Most of the logic was removed to simplify what you needed to dig through, but the return value of the code you saw was intended to be the interpolation factor for a lerp operation. There's no logic in there important enough to fix; I just needed to understand why it was failing.

Thank you very much for your help. I appreciate you taking the time to look at it.

6
If I were to guess, it looks like I may be running out of some type of internal memory. If I unplug the bottom two connectors from my 8-choice depth switch, it works fine. And it will fail again even if I plug some constant zeroes into those bottom two connectors. As far as I know, those bottom connectors shouldn't even influence anything when the switch variable chooses the top-most option. I guess it depends on how it works internally.

In many cases I can return at a certain node and all is well, but if I add any type of operation onto that and return afterward, it fails. And in many of the failing cases, simply returning from an earlier point in the function makes it work.

So yeah, I think I'm running out of memory. Is there any type of memory setting in here that I can crank up? I'm only intending to use Substance offline, so I'm not worried about compatibility with any engines.

Thanks again

7
This is more of an unintuitive setup situation than a bug. When you attach a clamp function node onto an existing value (select any value node, press tab, choose the clamp node), it connects the existing value node to the clamp node's MAXIMUM connector rather than its INPUT connector, as one would expect.

Just something that may lead to buggy graphs if someone was in a hurry, in my opinion.

8
Sure, I can upload the graph and functions. It's a bit convoluted, and I didn't want to (seemingly) ask someone to debug my graph for me. I was hoping there was a simple explanation, like maybe I'm accessing something I shouldn't.

If you want to see the strangeness as it stands, open Interpolation_Test.sbs. Ignore the output node and just check the result of the bottom pixel processor. If you look at the bottom pixel processor's edit function, the variable Depth_Test is supposed to control the number of pixels that get tested in that routine, but it doesn't seem to have any effect at all when I change the value of Depth_Test. However, if I plug in the constant integer value node below it and change that, it seems to work. But even then, if you change that value to 5, you get a strange gradient result. Change to 6 and you get pure white.

There are a lot of repeat-like functions to scan various pixels, but the root scanning function is pixelVarianceScanRoot. I'm wondering if I'm doing something wrong in there that causes the strangeness. Maybe writing to pixels outside of the memory I should be accessing? It's one of the few explanations I could come up with that makes any sense.

I've only been using Designer for a week or so, so still feeling my way around, and probably making some mistakes. Thanks for checking it out.

9
I've been trying to debug my function graphs for a few hours now. I'm using "set as return value" to trace through my function logic, and I'm getting really strange results.

For example, if I have nodes set up like A->B->C, and returning at C gives me a clear image, but then dividing C by 2 and returning there gives me a completely black or white (or sometimes even a gradient) image, is this a sign of some internal failure or limitation?

Even if I replace the divide by 2 with something like adding 1, I still get the strange behavior, so it's not related to the division. I'm wondering if I'm hitting some type of internal memory limit. My graph is attempting to sample pixels around the active pixel to determine how varied the area is. And it works great when scanning 32 pixels away, and 64 pixels away. But I've hit this wall when I try 128. It's a shame, because I was really liking the effect, but I can't take it to large scale until I figure out what's happening.
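For context, this is roughly the kind of thing the graph is doing, written as plain Python rather than function nodes (the ring sampling and the names here are just an approximation, not my actual graph):

```python
import numpy as np

def local_variance(image: np.ndarray, x: int, y: int,
                   radius: int = 32, samples: int = 16) -> float:
    """Sample `samples` pixels on a ring of the given radius around (x, y)
    and return the variance of the sampled values (grayscale image)."""
    h, w = image.shape
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    xs = np.clip((x + radius * np.cos(angles)).astype(int), 0, w - 1)
    ys = np.clip((y + radius * np.sin(angles)).astype(int), 0, h - 1)
    return float(np.var(image[ys, xs]))

# Example: variance at the radii mentioned above, for the centre of a random image
img = np.random.default_rng(0).random((256, 256))
print([local_variance(img, 128, 128, radius=r) for r in (32, 64, 128)])
```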

If this is not likely, then it's possible that I just have a strange bug in one of my function graphs somewhere.

Any advice or just a point in the right direction for the documentation would be appreciated, thanks!

10
Yeah, that would be great, because we currently have to temporarily override the graph format to operate in any format other than 8-bit, which can be problematic if you forget to change it back to relative before you save.

For example, if I'm writing a grayscale processing graph of any kind, I always want to use 16-bit rather than 8-bit because of the quality loss at 8 bits. That means I need to override the graph format pretty much every time I create a new node, so over half of my custom graphs probably have their format overridden to 16-bit right now, whereas I would much rather they adapt to whatever format is plugged into them.
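A quick NumPy illustration of the 8-bit quality loss I mean (the numbers are just an example): a shallow gradient collapses to a handful of distinct levels at 8 bits but survives at 16.

```python
import numpy as np

# A shallow gradient covering only 2% of the full range, like a subtle height map
gradient = np.linspace(0.49, 0.51, 1024)

steps_8bit = np.unique(np.round(gradient * 255)).size     # distinct 8-bit levels
steps_16bit = np.unique(np.round(gradient * 65535)).size  # distinct 16-bit levels

print(steps_8bit, steps_16bit)   # about 6 levels vs 1024 (visible banding vs smooth)
```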

Just a convenience issue, I guess.

11
Substance Designer - Discussions - Re: UV Seam Repair Filter
 on: February 18, 2019, 11:58:06 pm 
I'm surprised someone hasn't found a way to hack in loops. I tried it myself by creating recursive functions, but it appears Designer prevents that completely. I can't even use function A in B if A already calls B in any way. I even tried three levels deep.

I don't understand the need to protect us from loops. Having the program hang in an infinite loop once in a while is much better than copying and pasting the same code 16 times, or being completely cut off from certain functionality.
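For what it's worth, this is all the recursion hack would amount to in a normal language (plain Python sketch with made-up operations): two functions calling each other to fake a loop, which Designer refuses to allow.

```python
def ping(state: float, remaining: int) -> float:
    """Half of a mutually recursive pair that together behave like a loop."""
    if remaining <= 0:
        return state
    return pong(state + 1.0, remaining - 1)

def pong(state: float, remaining: int) -> float:
    """The other half; alternates with ping until the counter runs out."""
    if remaining <= 0:
        return state
    return ping(state * 0.5, remaining - 1)

# Equivalent to looping 16 times, alternating the two operations
print(ping(0.0, 16))
```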


12
It would be very helpful if it were possible to change function parameters via UI controls like normal graph nodes.

For example, if I needed a function that blended two samples, and one of that function's parameters was "blend mode" (an int), I could plug in all of the parameters I want to control dynamically via connectors, then choose the blend mode manually from a drop-down control, since it will never change in my situation.

Since nodes already allow optional parameters, I don't think this will cause any harm. It essentially allows the graph writer to set the "default" value of unplugged parameters.
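In normal code terms it's just default parameter values; a rough Python sketch with made-up names:

```python
def blend(a: float, b: float, factor: float, blend_mode: str = "lerp") -> float:
    """`blend_mode` acts like an unplugged graph parameter with a UI-chosen
    default; callers that never need to change it simply don't pass it."""
    if blend_mode == "multiply":
        return a * b
    return a + (b - a) * factor   # default: linear interpolation

print(blend(0.2, 0.8, 0.5))                          # uses the "default" blend mode
print(blend(0.2, 0.8, 0.5, blend_mode="multiply"))   # explicitly "plugged"
```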

13
Does anyone know if it's possible to set the parent bit depth of the graph? I.e., we have the drop-down at the top to select the parent resolution, but I don't see any controls to set the bit depth. Still looking, but I haven't found this info in the documentation yet.

14
I figured it out. I was not setting my saved graph to be relative to parent. For some reason I had it in my head that input nodes could be set to relative but not graphs themselves; it looks like both need to be set relative to parent in this case.

Thanks again for the help.

15
> However, when it comes to baking the mesh maps, am I going about it correctly by freezing the subd mesh into a high polygon mesh and using that as my high resolution mesh within SP?
I've never used Modo, but it's a pretty straightforward process. As long as you're exporting your high poly mesh without any surface issues, it should work. My typical workflow (on a non-sculpted mesh) is to build a low poly version first, then use subdivision and control loops to define the edges the way I want. That's it.

Sometimes I reduce the low poly further by collapsing its perfect quads into triangles, but wow do I hate that part. Someone should build a tool that lets you create a single quad-based low poly model and automatically generates the low and high poly outputs from it. Maybe by defining a global edge sharpness and then manually adjusting certain other edges. No control loops, no triangles, completely non-destructive. I need to build that.

My low poly models are usually around 500 polygons for things like props, and 2K-5K for characters. Hi poly doesn't matter much, but I usually try to keep it below a million.
