
Good day everyone. I'm not sure if this is the right place to ask, but any help would be greatly appreciated!

As a quick explanation: I am working on spintronics in epitaxial systems. The usual methods of data analysis, at least for my measurement technique, assume an isotropic system.

Of course, the standard approach does not work for me because my samples are crystalline. However, I wrote a simulation that generates the right signal for the correct input parameters. I verified this on standard systems using the values extracted with the standard technique, and on a few epitaxial systems where I manually tuned the simulation parameters until I got a good match.

Therefore, I was wondering if anyone knows of an efficient way to iterate over the simulation (each run takes about 15 seconds) while varying its parameters, so that I can match the output to the experimental data and extract the information I am interested in that way.

Edit: thank you all for the answers. Unfortunately, steepest descent would seem to require knowing how the function looks in parameter space, which is exactly the issue I wanted to avoid: I cannot run billions of simulations just to map out parameter space. Even if I could, I would not know how to use such a map, as that would probably require a neural network, and I have no idea how to set one up.
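For concreteness, here is a minimal sketch of the kind of fit I have in mind, using SciPy's Nelder-Mead simplex method, a derivative-free optimizer that only evaluates the misfit at a handful of points per step rather than mapping all of parameter space. The `run_simulation` function and its cosine signal are a toy stand-in for the real 15-second simulation, and the two parameters and their starting values are placeholder assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def run_simulation(params, angles):
    """Toy stand-in for the real simulation: an anisotropic
    (twofold) signal with amplitude a and offset b."""
    a, b = params
    return a * np.cos(2 * angles) + b

# Pretend "experimental" data generated with known parameters.
angles = np.linspace(0.0, np.pi, 50)
measured = run_simulation([1.3, 0.4], angles)

def misfit(params):
    """Sum of squared residuals between simulation and data."""
    return np.sum((run_simulation(params, angles) - measured) ** 2)

# Nelder-Mead needs no gradient; it just calls misfit() repeatedly.
result = minimize(misfit, x0=[1.0, 0.0], method="Nelder-Mead",
                  options={"xatol": 1e-6, "fatol": 1e-9})
# result.x holds the recovered parameters, result.fun the final misfit.
```

At 15 seconds per simulation, the cost is roughly (number of misfit evaluations) × 15 s, so it is worth logging `result.nfev` to see how many runs the optimizer actually needed.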

  • Steepest descent? See these 3blue1brown videos on Neural Networks – Commented Aug 3, 2024 at 14:35
  • Conjugate gradient (if it can be used here) would converge faster than steepest descent (despite the latter's name... something like requiring only the square root of the number of steps, IIRC) – Commented Aug 3, 2024 at 15:37
  • Basically an optimization/minimization problem; Python is full of them; presumably other useful scripting languages have similar packages you can import for this? – Commented Aug 3, 2024 at 17:50
  • Some ideas: a genetic algorithm (allows a search without knowing the gradient), Bayesian optimization (a similar concept, but based on Bayesian statistics), or building a fast surrogate model for your simulations and then running the optimization in parameter space on the surrogate instead of the full simulation. Another question is the dimensionality of your parameter space; complexity tends to be exponential in it, so the right strategy will depend on that. – Commented Aug 5, 2024 at 14:01
  • Re: the edit. You misunderstand the method. You don't have to map parameter space with billions of evaluations. You just run the algorithm and it explores as little of the parameter space as it can. A genetic algorithm or something similar may be better if you expect there to be lots of local minima. – Commented Aug 5, 2024 at 15:00
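The genetic-algorithm-style search suggested in the comments can be sketched with SciPy's differential evolution, a gradient-free population method that is more robust to multiple local minima than a simplex search. The toy simulation and the parameter bounds below are placeholder assumptions; in practice the bounds would come from physically plausible ranges for the spintronic parameters.

```python
import numpy as np
from scipy.optimize import differential_evolution

def run_simulation(params, angles):
    """Toy stand-in for the real simulation (see question)."""
    a, b = params
    return a * np.cos(2 * angles) + b

angles = np.linspace(0.0, np.pi, 50)
measured = run_simulation([1.3, 0.4], angles)  # pretend experimental data

def misfit(params):
    return np.sum((run_simulation(params, angles) - measured) ** 2)

# Bounds on (a, b) are assumptions; tighter bounds mean fewer evaluations.
bounds = [(-2.0, 2.0), (-1.0, 1.0)]

# Evolves a population of candidate parameter sets; polish=True refines
# the best candidate with a local optimizer at the end.
result = differential_evolution(misfit, bounds, seed=0, polish=True)
# result.x holds the recovered parameters.
```

Note the evaluation budget: a population method calls the simulation many more times than a local search, so at 15 s per run it helps that `differential_evolution` also accepts `workers=-1` to farm evaluations out over all CPU cores.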
