>>354

Yes, yes I could! Though there's still some disagreement on finer points, the definition I've heard given is:

*"Chaos is aperiodic long-term behaviour in a deterministic system that exhibits sensitive dependence on initial conditions."*

Aperiodic means that the system never settles into a repeating cycle (e.g. an undamped pendulum, after swinging away and back again, ends up where it started, so it's not aperiodic) and never settles down to a fixed state (e.g. a pendulum hanging vertically, not swinging, is similarly not aperiodic). A deterministic system is one that completely obeys a set of well-defined rules, with no randomness involved. Sensitive dependence on initial conditions means that if the same system starts from two slightly different states, the two trajectories get exponentially further apart as time goes on. You can measure this with what are called Lyapunov exponents.
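(If you'd like to actually compute one: for a one-dimensional map, the Lyapunov exponent is just the long-run average of log|f′(x)| along an orbit. Here's a rough Python sketch of that, using the map *x*_{n+1} = 4*x*_{n}(1 - *x*_{n}) as a test case, since for that map the exact answer is known to be ln 2. The function name and parameter choices are my own.)

```python
import math

def lyapunov_estimate(r, x0=0.2, n_transient=500, n_iter=20000):
    """Estimate the Lyapunov exponent of x -> r*x*(1 - x) by
    averaging log|f'(x)| = log|r*(1 - 2x)| along an orbit."""
    x = x0
    for _ in range(n_transient):   # let the orbit settle down first
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter

print(lyapunov_estimate(4.0))  # positive => chaotic (exact value is ln 2)
print(lyapunov_estimate(2.5))  # negative => nearby orbits converge
```

A positive exponent means nearby orbits separate exponentially (chaos); a negative one means they converge.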

Chaotic systems are famous for being extremely unpredictable (thanks to the sensitive dependence on initial conditions) without involving any actual random processes (because they're deterministic), which is pretty crazy when you think about it. It's also important to make clear that despite both being unpredictable, chaos is very different from randomness. Chaos isn't purely mathematical either; it shows up in many real systems, such as the weather, stock markets, animal populations, electrical signals, orbits in space, and heart fibrillation.

Though I'm not much of a programmer (I can just about fumble my way through Python and MATLAB and that's it) I know that some of the people reading this will be, so here's a little project that'll give you some idea what chaos is like. The following equation (known as the logistic map):

*x*_{n+1} = *rx*_{n}(1 - *x*_{n})

is used as a simple model for animal populations, where *x*_{n} is the population (expressed as a dimensionless quantity between 0 and 1) at the *n*th generation, and *r* is the replacement rate, a parameter which roughly corresponds to how fast/well the animal breeds. For *r* < 1, the population always goes extinct, which isn't very interesting.
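Iterating the map is only a few lines in Python (just a sketch; the function name and defaults are mine):

```python
def logistic_orbit(r, x0=0.5, n=100):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the whole trajectory."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# For r < 1 the population dies out; for 1 < r < 3 it settles
# to the fixed point 1 - 1/r.
print(logistic_orbit(0.8)[-1])   # near 0
print(logistic_orbit(2.5)[-1])   # near 0.6
```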

Using a computer program, try investigating the long term (after a few hundred generations) behaviour of the system for different values of *r*. In particular, look at what happens for 1 < *r* < 3, then for 3 < *r* < 3.5, then for 3.5 < *r* < 3.57, and, finally, for 3.57 < *r*. To get a better idea what's going on, run the above system for many values of *r* between 1 and 4. Discard the first few hundred generations (while the system's still settling down to its long-term behaviour) and plot the next few hundred values of *x*_{n} against *r*. It doesn't matter what value of *x* you start with, so long as it's between 0 and 1. You should find that interesting things happen at *r* = 3, 3.5, ..., 3.57, and, perhaps, other values above 3.57.
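Even without a plotting library, you can get a crude version of the same picture by counting how many distinct long-term values show up at each *r*. Here's one way to sketch that (the rounding tolerance and names are my own choices, not part of the exercise):

```python
def attractor(r, x0=0.5, n_transient=500, n_keep=200):
    """Return the distinct long-term values of the logistic map at this r,
    rounded to 3 decimals so that nearby points merge into one."""
    x = x0
    for _ in range(n_transient):   # discard the transient
        x = r * x * (1 - x)
    seen = set()
    for _ in range(n_keep):
        x = r * x * (1 - x)
        seen.add(round(x, 3))
    return sorted(seen)

# Period doubling on the way to chaos: one value, then two,
# then four, then (past ~3.57) a whole smear of values.
for r in (2.5, 3.2, 3.5, 3.9):
    print(r, len(attractor(r)))
```

To draw the full bifurcation diagram, sweep *r* from 1 to 4 and scatter-plot every retained (*r*, *x*) pair, e.g. with matplotlib's `pyplot.scatter`.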

For bonus can't-just-google-the-answer points, try repeating the above plot for:

*x*_{n+1} = *r* sin(π*x*_{n})

If you want to know more about chaos from a non-mathematical perspective, you should read *Chaos: Making a New Science* by Gleick, or *Deep Simplicity* by Gribbin. For those of you who aren't afraid of a little calculus, I highly recommend *Nonlinear Dynamics and Chaos* by Strogatz. If you can do multivariable calculus and Taylor series, that's about all of the assumed knowledge you need.
