This is nice, but I believe a simpler design could work better.
Simply make a model which transforms a 3D section of an image into an embedding vector, and another model which can reverse the process (i.e., an encoder-decoder). Do that for every tile of a starting state.
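Something like this, say in PyTorch (a rough sketch only; the tile size, channel counts, and embedding width are all made-up choices):

```python
import torch.nn as nn

# Sketch: encode a 16^3 tile (one channel) into a 256-d embedding, and decode
# it back. Every size here is illustrative, not tuned.
class TileEncoder(nn.Module):
    def __init__(self, emb_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 32, kernel_size=4, stride=2, padding=1),   # 16 -> 8
            nn.GELU(),
            nn.Conv3d(32, 64, kernel_size=4, stride=2, padding=1),  # 8 -> 4
            nn.GELU(),
            nn.Flatten(),
            nn.Linear(64 * 4 * 4 * 4, emb_dim),
        )

    def forward(self, tile):              # tile: (B, 1, 16, 16, 16)
        return self.net(tile)             # -> (B, emb_dim)

class TileDecoder(nn.Module):
    def __init__(self, emb_dim=256):
        super().__init__()
        self.fc = nn.Linear(emb_dim, 64 * 4 * 4 * 4)
        self.net = nn.Sequential(
            nn.ConvTranspose3d(64, 32, kernel_size=4, stride=2, padding=1),  # 4 -> 8
            nn.GELU(),
            nn.ConvTranspose3d(32, 1, kernel_size=4, stride=2, padding=1),   # 8 -> 16
        )

    def forward(self, z):                 # z: (B, emb_dim)
        x = self.fc(z).view(-1, 64, 4, 4, 4)
        return self.net(x)                # -> (B, 1, 16, 16, 16)
```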
Make 'upscale' and 'downscale' models which can take a grid of embedding vectors and return a single vector representing the whole, and go back again.
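Again only a sketch, assuming 'downscale' means pooling a 2x2x2 grid of tile embeddings into one coarser embedding and 'upscale' means the inverse:

```python
import torch.nn as nn

class Downscale(nn.Module):
    def __init__(self, emb_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(8 * emb_dim, 512), nn.GELU(), nn.Linear(512, emb_dim)
        )

    def forward(self, grid):              # grid: (B, 2, 2, 2, emb_dim)
        return self.net(grid.flatten(1))  # -> (B, emb_dim)

class Upscale(nn.Module):
    def __init__(self, emb_dim=256):
        super().__init__()
        self.emb_dim = emb_dim
        self.net = nn.Sequential(
            nn.Linear(emb_dim, 512), nn.GELU(), nn.Linear(512, 8 * emb_dim)
        )

    def forward(self, z):                 # z: (B, emb_dim)
        return self.net(z).view(-1, 2, 2, 2, self.emb_dim)
```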
Then make an 'advance time' model, which takes an embedding vector and advances it in time by a given number of seconds/microseconds/days.
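A minimal version; conditioning on log(dt) is my made-up choice so one model can span microseconds to days:

```python
import torch
import torch.nn as nn

class AdvanceTime(nn.Module):
    def __init__(self, emb_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(emb_dim + 1, 512), nn.GELU(), nn.Linear(512, emb_dim)
        )

    def forward(self, z, dt):             # z: (B, emb_dim), dt: (B,) seconds
        cond = torch.log(dt).unsqueeze(-1)
        return z + self.net(torch.cat([z, cond], dim=-1))  # residual step
```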
Now train all the models end to end to ensure that all combinations of upscaling/downscaling/advancing/encoding/decoding produce similar outputs to traditional physics models.
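One such consistency term could look like this (illustrative only; the full loss would sum analogous terms over every path combination, e.g. advance-then-downscale vs. downscale-then-advance):

```python
import torch.nn.functional as F

# x_t, x_tdt: simulator states at times t and t+dt; enc/dec/adv as above.
def training_step(enc, dec, adv, x_t, x_tdt, dt):
    z = enc(x_t)
    pred = dec(adv(z, dt))                 # encode -> advance -> decode
    recon = dec(z)                         # plain autoencoder path
    return F.mse_loss(pred, x_tdt) + F.mse_loss(recon, x_t)
```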
Use an ensemble of models or a sampling scheme to find places where outputs do not closely match, and insert more training data from the physical simulation at those points.
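The sampling side might be as simple as ranking states by ensemble variance and sending the worst offenders back to the physics simulator for labels (the function names and the batch size k are hypothetical):

```python
import torch

def disagreement(models, z, dt):
    # Per-state variance across an ensemble of 'advance time' models.
    preds = torch.stack([m(z, dt) for m in models])  # (M, B, emb_dim)
    return preds.var(dim=0).mean(dim=-1)             # (B,)

def pick_for_relabeling(models, states, dt, k=32):
    # Return the k states the ensemble disagrees on most.
    scores = disagreement(models, states, dt)
    return states[scores.topk(k).indices]
```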
Interesting work.
Given the recent noise around this paper (https://arxiv.org/pdf/2407.07218) about "weak baselines" in ML x CFD work, I wonder how it resonates with this specific work.
I am not super familiar with DEM, but I know that other particle-based models such as SPH benefit immensely from GPU acceleration. Does it make sense to compare with a CPU implementation?
Besides, the output of NeuralDEM seems to be rather coarse fields, correct? In that sense (and again, I'm not an expert in granular models, so I might be entirely wrong), does it make sense to compare with a method that operates under a very different set of constraints? Could we instead think of a numerical model that computes the same quantities much more efficiently, for example?
Here is some previous work by some of the same people [0], which I took part in. This seems like a significant step up from that work, including a novel idea that resolves quite a few issues we had. Love to see that.
[0]: https://ml-jku.github.io/bgnn/
Interesting. I wonder what parts of this approach could be adapted to DEM models of solids. For those unaware: even though DEM is naturally chosen for fluids, depending on how you configure the force laws between particles you can easily model solids as well, where each particle is essentially a chunk of material. There are then some interesting choices to make about (a) what kind of lattice you set the initial particles up in, (b) how you tune the force laws to get the macroscopic properties you want around stiffness, etc., and (c) whether you use multiple “types” of particles forming a composite.
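To make (b) concrete, here is a toy linear spring-dashpot contact law, the usual DEM starting point: stiffness k sets the macroscopic modulus and damping c the dissipation. The values are purely illustrative, and a real solid model would add tangential/frictional and bonded terms:

```python
import numpy as np

def contact_force(x_i, x_j, v_i, v_j, r_i=0.5, r_j=0.5, k=1e4, c=10.0):
    d = x_j - x_i
    dist = np.linalg.norm(d)
    overlap = (r_i + r_j) - dist
    if overlap <= 0.0:
        return np.zeros(3)                # not in contact
    n = d / dist                          # unit normal from i to j
    v_n = np.dot(v_j - v_i, n)            # normal relative velocity
    # Repulsive spring plus velocity-proportional damping, acting on particle i.
    return (-k * overlap + c * v_n) * n
```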
I'm sorry, but DEM is not for fluid simulation. It's used to simulate granular materials by default. Also the hopper discharge that is shown does not contain any fluid. The fluid is usually modeled using a different tool (e.g. using the finite volume method) which is then coupled to the particles.
Okay, fair: I was using "fluid" loosely (and inaccurately) to mean both granular and fluid behavior. But there's nothing inherently incompatible between fluid dynamics and the discrete element method as far as I'm aware, just as there's nothing inherently incompatible with solids. Sure, SPH or LBM or FVM are the more traditional choices for fluids, and computationally more tractable in most cases, but they aren't necessarily “more right.”
An awesome paper on how powerful particle-based methods can be:
https://www.sciencedirect.com/science/article/pii/S187775032...
And a fun image of a DEM solid model of fracture:
http://www.cba.mit.edu/media/DEM/index.html