### Mike's Notes

Nature recently published an interesting article about loss functions. This could be a useful way to check Pipi.

### Resources

- Inside the maths that drives AI
- https://media.nature.com/original/magazine-assets/d41586-024-02185-z/d41586-024-02185-z.pdf
- https://en.wikipedia.org/wiki/Loss_function

### Nature

#### "Inside the maths that drives AI"

Loss functions provide a mathematical measure of wrongness: they tell researchers how well their artificial-intelligence (AI) algorithms are working. There are dozens of off-the-shelf functions. But choosing the wrong one, or handling it badly, can create AI systems that blatantly contradict human observations or obscure experiments’ central results. Programming libraries such as PyTorch and scikit-learn allow scientists to easily swap and trial functions. A growing number of scientists are creating their own loss functions. “If you’re in a situation where you believe that there are probably errors or problems with your data … then it’s probably a good idea to consider using a loss function that’s not so standard,” says machine-learning researcher Jonathan Wilton.

..." - **Nature**
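Wilton's point about non-standard losses for noisy data can be sketched without any framework. The snippet below (plain Python; the names `mse` and `huber` are my own, standing in for the off-the-shelf losses that libraries such as PyTorch and scikit-learn let you swap) compares a standard squared-error loss with a robust Huber loss on data containing one gross outlier:

```python
def mse(preds, targets):
    """Mean squared error: penalises residuals quadratically,
    so one outlier can dominate the total loss."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

def huber(preds, targets, delta=1.0):
    """Huber loss: quadratic for small residuals, linear for large ones,
    so an outlier contributes far less than it does under MSE."""
    total = 0.0
    for p, t in zip(preds, targets):
        r = abs(p - t)
        if r <= delta:
            total += 0.5 * r ** 2
        else:
            total += delta * (r - 0.5 * delta)
    return total / len(preds)

targets = [1.0, 2.0, 3.0, 100.0]   # last value is a gross outlier
preds   = [1.1, 1.9, 3.2, 3.0]     # model fits well except on the outlier

print(mse(preds, targets))    # outlier dominates the average
print(huber(preds, targets))  # outlier contributes only linearly
```

Swapping between the two is a one-line change at the call site, which is what makes trialling different loss functions on the same model so cheap.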

### Wikipedia

"In mathematical optimization and decision theory, a loss function or cost function (sometimes also called an error function)[1] is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite (in specific domains, variously called a reward function, a profit function, a utility function, a fitness function, etc.), in which case it is to be maximized. The loss function could include terms from several levels of the hierarchy.

In statistics, typically a loss function is used for parameter estimation, and the event in question is some function of the difference between estimated and true values for an instance of data. The concept, as old as Laplace, was reintroduced in statistics by Abraham Wald in the middle of the 20th century.[2] In the context of economics, for example, this is usually economic cost or regret. In classification, it is the penalty for an incorrect classification of an example. In actuarial science, it is used in an insurance context to model benefits paid over premiums, particularly since the works of Harald Cramér in the 1920s.[3] In optimal control, the loss is the penalty for failing to achieve a desired value. In financial risk management, the function is mapped to a monetary loss. ..." - **Wikipedia**
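The statistical use described above, estimating a parameter by minimizing a loss that measures the gap between estimated and true values, fits in a few lines. The sketch below (plain Python; all names are my own) estimates a single parameter `theta` by gradient descent on the mean squared error, which converges to the sample mean:

```python
data = [2.0, 4.0, 6.0, 8.0]

def loss(theta):
    """Mean squared error between the estimate and each observation."""
    return sum((theta - x) ** 2 for x in data) / len(data)

def grad(theta):
    """Derivative of the loss with respect to theta."""
    return sum(2 * (theta - x) for x in data) / len(data)

theta = 0.0
for _ in range(200):
    theta -= 0.1 * grad(theta)   # step downhill on the loss surface

print(round(theta, 4))  # converges to the sample mean, 5.0
```

For squared-error loss the minimizer is the mean; choosing absolute error instead would drive `theta` toward the median, which is one concrete way the choice of loss function changes what an algorithm actually learns.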
