Thursday, October 15, 2009

They published my idea first (and Dante Chialvo's too)!

When Tiago was working on his master's degree, he came to LASCON - Latin American School of Computational Neuroscience, in July 2008. At Walter's canteen, Roque recalls that we discussed the question of subsampling in critical systems (using that very word). I suggested that Tiago run a simulation on a simple system (for example, the random excitable network that Mauro and I worked on in 2006, the OFC model (which is related to integrate-and-fire neuron models), or even a traditional sandpile model).

A few months later, Chialvo gave the same idea to Tiago and Mauro (and made a point of writing down the idea, the date, and the people present on a napkin, and asked everyone to sign it!). I don't know whether he used the word subsampling; I believe he did. This time, Tiago believed in the idea (my problem is that a prophet is without honor in his own land...) and ran the simulations. Bingo: subsampling in critical systems produces distributions that deviate from power laws, looking more like log-normals or stretched exponentials.

But since credit goes to whoever publishes first... I guess Tiago, Mauro, Sidarta, Nicolelis, Chialvo and I were left empty-handed... And the worst part is that there wasn't even a little citation (for example, on the use of the branching parameter sigma in SOC models; I think Carmem Prado and I were among the first to explore that consistently, or at least we popularized the idea...). Well, of course all of this is clarified by Vespignani, Zapperi, Dickman and others, for example in this beautiful paper with a delicious title:

How self-organized criticality works: A unified mean-field picture

It came out in BMC:

Subsampling effects in neuronal avalanche distributions recorded in vivo

Viola Priesemann (1,2), Matthias HJ Munk (3) and Michael Wibral (4)

1Department of Neurophysiology, Max Planck Institute for Brain Research, Deutschordenstrasse 46, D-60528 Frankfurt am Main, Germany

2Group for Neural Theory, DEC, Ecole Normale Supérieure, Collège de France, 3, rue d'Ulm, 75005 Paris, France

3Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, D-72076 Tübingen, Germany

4MEG Unit, Brain Imaging Centre, J.W. Goethe University, Heinrich Hoffmann Strasse 10, D-60528 Frankfurt am Main, Germany


BMC Neuroscience 2009, 10:40. doi:10.1186/1471-2202-10-40

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/10/40

Received: 1 October 2008
Accepted: 29 April 2009
Published: 29 April 2009

© 2009 Priesemann et al; licensee BioMed Central Ltd.



Many systems in nature are characterized by complex behaviour where large cascades of events, or avalanches, unpredictably alternate with periods of little activity. Snow avalanches are an example. Often the size distribution f(s) of a system's avalanches follows a power law, and the branching parameter sigma, the average number of events triggered by a single preceding event, is unity. A power law for f(s), and sigma = 1, are hallmark features of self-organized critical (SOC) systems, and both have been found for neuronal activity in vitro. Therefore, and since SOC systems and neuronal activity both show large variability, long-term stability and memory capabilities, SOC has been proposed to govern neuronal dynamics in vivo. Testing this hypothesis is difficult because neuronal activity is spatially or temporally subsampled, while theories of SOC systems assume full sampling. To close this gap, we investigated how subsampling affects f(s) and sigma by imposing subsampling on three different SOC models. We then compared f(s) and sigma of the subsampled models with those of multielectrode local field potential (LFP) activity recorded in three macaque monkeys performing a short term memory task.


Neither the LFP nor the subsampled SOC models showed a power law for f(s). Both f(s) and sigma depended sensitively on the subsampling geometry and the dynamics of the model. Only one of the SOC models, the Abelian Sandpile Model, exhibited f(s) and sigma similar to those calculated from LFP activity.


Since subsampling can prevent the observation of the characteristic power law and sigma in SOC systems, misclassifications of critical systems as sub- or supercritical are possible. Nevertheless, the system specific scaling of f(s) and sigma under subsampling conditions may prove useful to select physiologically motivated models of brain function. Models that better reproduce f(s) and sigma calculated from the physiological recordings may be selected over alternatives.
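A minimal sketch of this subsampling effect, using a two-generation critical branching process as a toy stand-in for the paper's SOC models (the observation probability Q, the function names, and the simple sigma estimator here are my own illustrative choices, not the paper's methods):

```python
import random

random.seed(0)

Q = 0.2  # assumed probability that the electrode array "sees" a given event

def generation_sizes():
    """Two generations of a critical branching process (true sigma = 1):
    each active unit triggers 0 or 2 units in the next time bin,
    with equal probability, so the mean offspring number is 1."""
    n1 = 1
    n2 = sum(2 for _ in range(n1) if random.random() < 0.5)
    return n1, n2

def observed(n):
    """Binomial thinning: each event is recorded with probability Q."""
    return sum(1 for _ in range(n) if random.random() < Q)

full_ratios, sub_ratios = [], []
for _ in range(20000):
    n1, n2 = generation_sizes()
    full_ratios.append(n2 / n1)      # branching ratio under full sampling
    o1, o2 = observed(n1), observed(n2)
    if o1 > 0:                       # the avalanche is visible at all
        sub_ratios.append(o2 / o1)   # apparent branching ratio

sigma_full = sum(full_ratios) / len(full_ratios)
sigma_sub = sum(sub_ratios) / len(sub_ratios)
print(sigma_full, sigma_sub)  # apparent sigma falls far below the true sigma = 1
```

The apparent sigma drops well below 1 simply because most events go unrecorded, which is exactly the kind of misclassification toward subcriticality the abstract warns about.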

PS: Promises for the future: 1) Bring a notepad and pen to Walter's canteen. 2) Run the simulations myself, at least the preliminary results. 3) Learn from Chialvo.

PS2: The fact that the same (not so common) word was used in the same context suggests to me that someone chatted over coffee with someone, who chatted over coffee with someone, who chatted over coffee with Viola Priesemann. The social network of science has an average degree of separation of 3 or 4. That's dangerous! Who was it that didn't keep their mouth shut? Most likely, it was me... Or else this is yet another case of simultaneous discovery.

Update: Viola submitted the paper on October 1, 2008. I talked to Tiago in July 2008 (update: actually it was in January 2008), and Dante had the idea months later, without having had any contact with me. So there was no time for the idea to diffuse through the social network; they would not have had time to produce this (excellent) paper with so many details and experiments. Conclusion: it was a case of the idea being in the air... good old scientific telepathy. Unless Viola et al. are really fast at writing and publishing...

4 comments:

none said...



Roberto Takata

dchialvo said...

Hi O.

This is related to something I observe happening in science (at least in physics): there are spreaders who have no idea they are doing the spreading...

Probably you know the trick of playing, and beating, simultaneous chess against an even number of other players. This is how it works: you promise to beat a large number of Masters, even though you are a lousy chess player. You allow any degree of expertise; let them bring the best chess players in the world... the only constraint is that the number of Masters must be EVEN, and you choose one additional player (whom you select to be at your own average chess level).
You seat the Masters at a round table, each one facing another, or arranged in any other way such that they cannot see what the others are playing. Then you start the games, alternating black and white pieces.
The TRICK: your only job is to copy the move of one Master onto the board of the other Master, except for the idiot (the player you chose, who is not a Master).
Everybody will believe you are playing a simultaneous game against Masters.
The chances are that Masters of the same level will each win 50% of the games, so you only have to beat the idiot to make everybody believe that, on average, you beat N Masters plus an idiot...
(with N as large as you wish)

What does this have to do with subsampling (and many other similar cases)?

In physics, there is always someone (idiots included) going to meetings where masters (or not) discuss ideas; they then go to another meeting (or lab) and "suggest moves" which "locally" seem to originate with that person... (but come from the previous meeting). Of course this is beneficial for progress, because the best players end up competing against each other (without knowing it...).
In some cases the spreader gets recognition as well.
So if you travel a lot and keep your mouth shut about WHERE you heard WHAT, you will be considered a genius... (Per Bak called them prolific travelers)

Don Quixote said that "any well-organized society needs a few parasites"; in this case it is the same, science needs parasites "playing the simultaneous game". I recognize that some of the most cited statistical physicists have done precisely that in the critical phenomena field... and that could be a good thing.

Warmest regards


Osame Kinouchi said...

Dear Dante,

Nice observation!

However, I suspected that Viola, being Argentine, had heard your idea in some talk (or bar) of yours.

It is surprising that both you and I used the word subsampling to refer to measuring the activity of only a few units, when the power law only appears as N -> \infty, and that Viola used it too, almost simultaneously.

Marcelo Tragtenberg, a friend of mine, does not like the term "subsampling" because "sampling" already conveys the idea that the sample is small, or at least smaller than the total population. What do you think? What would subsampling convey as a new concept? The idea that, for distributions with long tails, simple sampling is not sufficient to measure the moments of the distribution?



dchialvo said...

Dear Osame

I never heard of Viola in Argieland...
I recall that my first conversation about this subsampling was with D.P @ NIH, when I suggested that taking a few samples from a true (asymptotically) scale-free process must be "different" from doing the same with an exponentially decaying pdf...
Also I suggested (and did some calculations on a sandpile SOC model) that placing a single electrode to find out what is going on in a slice of tissue exhibiting avalanches, and computing some return-time statistics, would do the same trick, in the sense that in a scale-free avalanching process a single site can remain unvisited for a long, long time, while in an ergodic process the chances are higher...
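A toy version of that return-time contrast (a sketch under assumed distributions, not the original sandpile calculation): draw single-site inter-visit times from a power law and from an exponential with the same mean, and compare the longest quiet period in each.

```python
import random

random.seed(1)

ALPHA = 2.5  # assumed tail exponent for the scale-free case
MEAN = (ALPHA - 1) / (ALPHA - 2)  # mean of a Pareto(ALPHA) on t >= 1, here 3.0
N = 100_000

def pareto_wait():
    """Inverse-transform sample from P(T > t) = t**(1 - ALPHA), t >= 1."""
    return (1.0 - random.random()) ** (-1.0 / (ALPHA - 1.0))

scale_free = [pareto_wait() for _ in range(N)]
ergodic = [random.expovariate(1.0 / MEAN) for _ in range(N)]  # same mean

# Same average return time, wildly different longest "silent" period:
print(max(ergodic), max(scale_free))
```

With identical mean return times, the scale-free site shows quiet periods orders of magnitude longer than the exponential one, which is the signature a single electrode could pick up.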

At the time I was interested (and still am) in finding ways to "diagnose" criticality beyond compiling the usual PDF...

Compiling the PDF is too easy, but it is like an autopsy. I wish we could do something more subtle, earlier and smarter, like the calculations we do in other dynamical systems of lower dimension... remember the old days of the Wolf algorithm for Lyapunov exponents from time series, Grassberger & Procaccia for dimension, etc... I think we should develop tools in the same direction and do something better than citing the Clauset paper to best-fit a power law...