Wednesday May 18
09.00 - 09.15 | Registration (Coffee/tea) | |
e-Humanities(1) | ||
Human behaviour in the web X.0 age - links to e-humanities | ||
09.15 - 09.30 | Peter Richmond + local organizers | Welcome to the Annual Meeting |
09.30 - 10.00 | Vincent Buskens, Milena Tsvetkova | Egalitarian Networks from Asymmetric Relations: Coordination on Reciprocity in a Social Game of Hawk-Dove |
10.00 - 10.30 | Eric Postma | Van Gogh's Uncertainty Principle |
10.30 - 10.50 | Renaud Lambiotte | The Personality of Popular Facebook Users |
10.50 - 11.10 | Coffee/tea | |
11.10 - 11.30 | Yurij Holovatch | Collective behaviour in complex networks: scaling and beyond |
11.30 - 12.00 | David Goldberg | Connected Learning |
12.00 - 12.20 | Carmen Costea | The fall of HR and the rise of excellence in human beings behavior |
12.20 - 13.40 | Lunch and POSTER SESSION | |
Global Crises | ||
13.40 - 14.10 | Damien Challet , Sorin Solomon, Gur Yaari | The Universal Shape of Economic Recession and Recovery after a Shock |
14.10 - 14.30 | Jürgen Mimkes | The econophysics of autocracy and democracy in the Arab world |
14.30 - 14.50 | Panos Argyrakis | Crisis spreading in world countries |
14.50 - 15.10 | Coffee/tea | |
15.10 - 15.40 | Cecilia Vernia | Inverse problem for interacting models in social sciences |
15.40 - 16.00 | Giulia Rotundo | Complex principal component analysis of directed networks |
16.00 - 16.20 | Ingve Simonsen | Persistent collective trend in stock markets |
16.20 - 16.40 | Vygintas Gontis | Minimal agent based models as a microscopic reasoning of nonlinear stochastic models |
17.00 - 18.30 | MC Meeting | |
19.00 - | Conference dinner |
Thursday May 19
Friday May 20
09.00 - 09.15 | Coffee/tea | |
Infophysics | ||
09.35 - 09.55 | Araceli N. Proto | Classical, Semiclassical and Quantum Models for Understanding Human Systems |
09.55 - 10.15 | Oleg Yordanov | Dynamics of public opinion under different conditions |
10.15 - 10.35 | Marcel Ausloos | A generalized Verhulst approach of population evolution |
10.35 - 11.00 | Coffee/tea | |
e-Humanities(2) | ||
Culture as a complex system | ||
11.00 - 11.20 | Suchecki, K.; Scharnhorst, A.; Akdag-Salah, A.; Gao, C. | Evolution of the Wikipedia category structure |
11.20 - 11.50 | Paul Wouters | Modelling research practices - the case of peer review |
11.50 - 12.20 | Sally Wyatt | On track: living and measuring everyday complexity |
12.20 - 13.50 | Lunch | |
13.50 - 14.20 | Sorin Solomon | Common Creativity Dynamics Patterns in sciences and humanities |
14.20 - 14.50 | Koen Frenken (together with Luis Izquierdo & Paolo Zeppini) | A branching-and-recombination model of innovation |
14.50 - 15.20 | Hans Kamermans | Predicting the Past for Present Purposes |
15.20 - 15.50 | Coffee/tea | |
15.50 - 16.20 | Franciska de Jong, Stef Scagliola | Enhanced publications, e-humanities practices and multi-media data |
16.20 - 16.50 | Charles van den Heuvel | Idea Collider: Elementary knowledge structures in the humanities |
Panos Argyrakis
Crisis spreading in world countries
We study the spreading of a crisis, such as the world crisis of the past few years, by constructing a global economic network that uses financial information about the economic relationships between the different countries of the world. We then spread a crisis from a single point of origin to the entire network using the Susceptible-Infected-Recovered (SIR) epidemic model with a variable probability of infection. Each country can be infected by a neighbouring country in the network with a certain probability. The probability of infection depends on the strength of the economic relations between the pair of countries and on the strength of the target country, as is natural for the economic dependence of one country on its trading partners. The actual data that we use involve two different sets: (1) the import-export data for all countries of the world and (2) the number of subsidiaries that each private company has established in other countries, considering the 5000 largest world corporations. It is expected that a crisis originating in a large country, such as the USA, has the potential to spread globally, like the recent crisis that originated in the mortgage sector but spread to practically the entire banking industry. Surprisingly, we show that in addition to the large economic powers of the world, countries with much lower GDP, such as Belgium, are also able to initiate a global crisis. Using the k-shell decomposition method to quantify the spreading power of a node, we obtain a measure of "centrality" of each country as a spreader in the economic network. We thus rank the different countries according to the shell they belong to and find the 12 most central countries. These countries are the most likely to spread a crisis globally. Of these 12, only six are large economies, while the other six are medium/small ones, a result that could not have been otherwise anticipated. Furthermore, we use our model to predict the crisis-spreading potential of countries belonging to different shells as a function of the crisis magnitude. Comparing the results with the actual situation of the economies of all countries, we find good agreement between this new model and the extent to which each country was involved in the crisis.
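A minimal sketch of the mechanism described above: SIR spreading on a synthetic weighted network with an infection probability that grows with tie strength and decreases with the target's economic strength, plus a k-shell ranking of potential spreaders. The network, the functional form of the infection probability and all parameter values are illustrative assumptions, not the authors' data or code.

```python
# Minimal sketch (not the authors' code): SIR crisis spreading on a weighted
# "trade" network, plus k-shell (k-core) ranking of potential spreaders.
import random
import networkx as nx

random.seed(1)

# Toy economic network: nodes are countries, weighted edges stand in for
# trade/ownership ties (the talk uses real import-export and subsidiary data).
G = nx.gnm_random_graph(50, 300, seed=1)
for u, v in G.edges():
    G[u][v]["weight"] = random.random()          # trade intensity (toy values)
strength = {n: sum(G[n][m]["weight"] for m in G[n]) for n in G}

def infection_prob(u, v, alpha=1.0):
    """Heuristic: stronger tie and weaker target -> higher infection chance."""
    return min(1.0, alpha * G[u][v]["weight"] / (1.0 + strength[v]))

def sir_spread(origin, recovery_prob=0.3):
    infected, recovered = {origin}, set()
    while infected:
        new_infected = set()
        for u in infected:
            for v in G[u]:
                if v not in infected and v not in recovered:
                    if random.random() < infection_prob(u, v):
                        new_infected.add(v)
        recovered |= {u for u in infected if random.random() < recovery_prob}
        infected = (infected | new_infected) - recovered
    return len(recovered)  # final size of the crisis

# k-shell decomposition as a proxy for spreading power ("centrality")
shell = nx.core_number(G)
top_spreaders = sorted(G, key=shell.get, reverse=True)[:12]
print("top shells:", [(n, shell[n]) for n in top_spreaders])
print("avg crisis size from node 0:",
      sum(sir_spread(0) for _ in range(20)) / 20)
```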
Marcel Ausloos
A generalized Verhulst approach of population evolution
Within the Verhulst approach to population evolution, a discussion is presented taking into account that growth can neither be infinite nor reach a steady state at an infinite asymptotic time. Some leveling-off should occur, followed either by renewed growth or by some decay. Examples are analyzed.
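For reference, the classical Verhulst equation together with one illustrative way of relaxing the fixed asymptote, a slowly varying carrying capacity; the specific generalization analyzed in the talk may take a different form.

```latex
% Classical Verhulst (logistic) growth with carrying capacity K:
%   dN/dt = r N (1 - N/K),  N(t) -> K as t -> infinity.
% One illustrative generalisation (not necessarily the form used in the talk):
% let the carrying capacity itself evolve slowly, so the population need not
% settle at a constant asymptote but can level off and then grow again or decay.
\begin{align}
  \frac{dN}{dt} &= r\,N(t)\left(1 - \frac{N(t)}{K(t)}\right), \\
  \frac{dK}{dt} &= \varepsilon\, g\!\left(K(t), t\right),
  \qquad |\varepsilon| \ll r .
\end{align}
```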
Vincent Buskens, Milena Tsvetkova
Egalitarian Networks from Asymmetric Relations:
Coordination on Reciprocity in a Social Game of Hawk-Dove
Asymmetric relations such as lending money, doing favors and giving
advice form the basis of mutual aid and cooperation in human societies.
However, they also provide a mechanism for the emergence of inequalities and
hierarchies. Reciprocal behavior at the dyadic and network levels can
prevent the aggregation of unequal exchange into unfair macro-level
outcomes. In this paper, we investigate the conditions under which a group
of individuals is more likely to develop a social norm of reciprocity and
coordinate on efficient and egalitarian structures from asymmetric dyadic
relations. We present findings from a laboratory experiment on a version of
the Hawk-Dove Game in which subjects interact repeatedly by choosing both
their partners and actions towards each of them. Our results indicate that
smaller groups are more likely to coordinate on egalitarian equilibria than
larger groups and that norms of direct reciprocity are more likely to emerge
than norms of indirect reciprocity. We also discover that although the
equilibria are egalitarian in terms of payoffs, they imply a dominance
hierarchy regarding the distribution of actions.
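For readers unfamiliar with the game, the standard textbook Hawk-Dove payoff matrix (resource value V, fighting cost C > V); the payoffs used in the experiment may differ.

```latex
% Standard (textbook) Hawk-Dove payoff matrix for a resource of value V
% contested at cost C > V; row player's payoff listed first.
\[
\begin{array}{c|cc}
            & \text{Hawk}                      & \text{Dove} \\ \hline
\text{Hawk} & \tfrac{V-C}{2},\ \tfrac{V-C}{2}  & V,\ 0 \\
\text{Dove} & 0,\ V                            & \tfrac{V}{2},\ \tfrac{V}{2}
\end{array}
\]
```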
Carmen Costea
The fall of HR and the rise of excellence in human beings behavior
People, regardless of their age, are increasingly subject to intense learning processes, often crowded with redundant or obsolete information and interspersed with challenging exams at all levels. Sometimes unable to provide the appropriate information, teachers and government representatives get lost in inappropriate laws or agreements and long political talks: they have no time to see the real benefit of training and education, and they do not know how to structure it, how to monitor a project, or how to achieve the right goals.
Thus, the educational inheritance remains a simple remake of ill-defined theoretical issues about areas of little use, disconnected from the real potential and opportunities of the market. Every year, the European Union spends huge funds with the main purpose of enabling students to learn towards excellence, to adapt ideas to their IQ and environment, to cooperate with their teachers as members of high-performing teams, and to restructure education from its foundations. As many students choose their careers for superficial or fashionable reasons, their development goes along with fake responsibility; creativity is still understood as a copy-paste process; and the release of funds into dissonant consulting activity is more a loss of time and money than a source of motivation and real engagement. Upon graduation, many adults become limited, unhappy, opportunistic careerists who work hard to earn their bread and for whom employment can never give the satisfaction of accomplishment. This is why most reconversion programmes fail and the funds are spent without any goals being achieved.
Often, management theories, job advertisements and recruiting firms focus on qualities that can never be met by one person: they look for successful, pro-active, intelligent individuals, sound professionals possessing inspirational leadership, good abilities to motivate teams, the ability to work hard and under pressure, availability after working hours, command of several foreign languages, a dynamic and competitive attitude, and so on.
This largely fictional portrait of a specialist is increasingly rare in our new mercantile societies, dominated by the trends and models of the small screen. Now that the former probation periods are gone, they have been replaced by short training periods under European structural financing. During these, only a small number of newly hired staff manage to adapt to the company and job requirements as young learners finding out the "secrets" of their chosen profession.
This is why we consider the need for a new paradigm in education, mainly in business education (BISOU), to fundamentally turn it towards a spiritual economy and human society. This is the only way to bring in a new value system, on which building appropriate models is the challenge. The new approach is evolving as a dynamic, open system related to the ecological and social aspects of our days.
The educational variables that we analyse and interpret are linked to environmental and social variables of other systems related to economic life. This requires a multidisciplinary and dynamic approach, integrating ideas from economic life with theories of ecology, sociology, political science, anthropology and psychology, with abilities proved as practical behaviour. New concepts of economic science concerning the health of the economy should be developed through interdisciplinary pedagogy and research. Redefining a few basic economic and educational concepts in terms of the new paradigm would mean a real re-spiritualisation and an operational step toward identifying the value system that defines the new vision.
Rethinking behaviour in terms of new models can not only remove their basic error - using money as the only variable to measure efficiency - but also introduce a new set of concepts and variables generated by the interaction with the ecological aspect of economy and society. It seems that measuring the efficiency of production processes in terms of net energy is more reliable than the macroeconomic analysis of monetary approaches. Such an approach, initiated in physics with the work of Nicholas Georgescu-Roegen, explores the entropic relevance of the life of social systems. Explanations of the social friction that unproductively dissipates energy and resources become obvious and necessary at the educational level, in order to better support sound competition and reduce conflicts.
Santo Fortunato
Characterizing and modeling online popularity
Koen Frenken
Coauthors: Luis R. Izquierdo, Paolo Zeppini
A branching-and-recombination model of innovation
To explain the dynamics of technological transitions, we develop an agent-based model with network externalities and two different types of innovations. Recombinant innovations create short-cuts which speed up technological progress, allowing transitions that are impossible with only branching innovations. Our model replicates some stylized facts of technological transitions, such as punctuated equilibria, path dependency and technological lock-in. We find analytically a critical mass of innovators for successful innovations and technological transitions. Recombinant innovation counters network externalities and calls for technological diversity as a key feature of technological transitions. An extensive simulation experiment shows that stronger network externalities are responsible for S-shaped utility and technological quality curves, indicating that a threshold of innovation probability is necessary to boost innovation. We finally introduce a policy view and interpret the innovation probability as the effort to foster technological change. A welfare measure including innovation costs presents an optimal interior value of the innovation effort. The optimal innovation effort is strongly correlated with the number of recombinations, which further indicates how important recombinant innovation is for achieving sustained technological progress at relatively low cost.
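A minimal sketch of the two innovation mechanisms discussed above, branching (a new technology slightly better than its parent) and recombination (a short-cut combining two existing designs), with a simple network-externality adoption rule; all rules and parameter values are illustrative assumptions rather than the authors' model.

```python
# Minimal sketch (not the authors' model) of branching vs. recombinant
# innovation with network externalities. Agents adopt the technology that
# maximises intrinsic quality + a bonus proportional to current adopters.
import random
random.seed(2)

N_AGENTS, STEPS = 200, 2000
P_BRANCH, P_RECOMBINE = 0.01, 0.005   # innovation probabilities (assumed)
EXTERNALITY = 0.01                     # utility bonus per adopter

quality = [1.0]                        # quality of each technology
choice = [0] * N_AGENTS                # everyone starts on technology 0

def utility(tech, adopters):
    return quality[tech] + EXTERNALITY * adopters[tech]

for t in range(STEPS):
    adopters = [0] * len(quality)
    for c in choice:
        adopters[c] += 1

    # Branching innovation: a new technology slightly better than its parent.
    if random.random() < P_BRANCH:
        parent = random.randrange(len(quality))
        quality.append(quality[parent] * (1.0 + random.uniform(0.0, 0.1)))
        adopters.append(0)

    # Recombinant innovation: a "short-cut" combining two existing designs.
    if random.random() < P_RECOMBINE and len(quality) >= 2:
        a, b = random.sample(range(len(quality)), 2)
        quality.append(max(quality[a], quality[b]) * 1.05)
        adopters.append(0)

    # One randomly chosen agent revises its choice (myopic best response).
    i = random.randrange(N_AGENTS)
    choice[i] = max(range(len(quality)), key=lambda k: utility(k, adopters))

best_used = max(quality[c] for c in choice)
print(f"{len(quality)} technologies created, best quality in use: {best_used:.2f}")
```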
Serge Galam
Tailor Based Allocations for Multiple Authorship: a fractional gh-index
A quantitative modification to keep the number of published papers invariant under multiple authorship is suggested. In those cases, fractional allocations are attributed to each co-author, with a summation equal to one. These allocations are tailored on the basis of each author's contribution. The scheme is denoted "Tailor Based Allocations (TBA)" for multiple authorship. Several protocols for TBA are suggested. The choice of a specific TBA may vary from one discipline to another. In addition, TBA is applied to the number of citations of a multiple-author paper so that this number is also conserved. Each author gets only a specific fraction of the total number of citations according to their fractional paper allocation. The equivalent of the h-index obtained by using TBA is denoted the gh-index. It yields values which differ drastically from those given by the h-index. The gh-index also departs from the index recently proposed by Hirsch to account for multiple authorship. Contrary to the h-index, the gh-index is a function of the total number of citations of each paper. A highly cited paper allows a better allocation for all co-authors, while a less cited paper contributes essentially to one or two of the co-authors. The scheme produces a substantial redistribution of the ranking of scientists in terms of quantitative records. A few illustrations are provided.
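A minimal sketch of the TBA bookkeeping, assuming the per-author fractions are already given (the abstract leaves the allocation protocol open); the gh-type index below, an h-index computed on fractionally allocated citations, is one plausible reading rather than Galam's exact definition.

```python
# Minimal sketch of Tailor-Based Allocation (TBA) bookkeeping and a gh-type
# index. The per-author fractions are simply supplied as input here, and the
# index (an h-index on fractionally allocated citations) is one plausible
# reading of the abstract, not necessarily the precise definition.

def tba_share(papers):
    """papers: list of (author_fraction, total_citations) for one author.
    Returns (fractional paper count, fractionally allocated citations)."""
    return (sum(f for f, _ in papers), sum(f * c for f, c in papers))

def gh_index(papers):
    """Largest g such that the author's g-th best paper, counted by its
    fractionally allocated citations f * c, has at least g of them."""
    allocated = sorted((f * c for f, c in papers), reverse=True)
    g = 0
    for i, cites in enumerate(allocated, start=1):
        if cites >= i:
            g = i
        else:
            break
    return g

# Example: three co-authored papers; only this author's tailored fraction of
# each paper is listed (fractions across all co-authors sum to one per paper).
papers = [(0.5, 40), (0.2, 100), (0.3, 12)]
print(tba_share(papers))   # (1.0, 43.6)
print(gh_index(papers))    # 3  (20, 20 and 3.6 allocated citations)
```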
David Goldberg
Connected Learning
This talk discusses a range of contemporary creative projects across digital humanities that are computationally driven. The talk raises questions about the presumption of unidirectional contributions from the physical sciences, mathematics, and computation in solving humanistic problems and argues rather for a more connected, collaborative, and interactive undertaking to address complex issues that are marked by irreducible technological, social, and cultural dimensions.
Vygintas Gontis
Coauthors: Aleksejus Kononovičius and Bronislovas Kaulakys
Minimal agent based models as a microscopic reasoning of nonlinear stochastic models
Recently we introduced a double stochastic process driven by a nonlinear scaled stochastic differential equation reproducing the main statistical properties of the return observed in financial markets [1, 2]. The proposed model is based on the class of nonlinear stochastic differential equations providing the power-law behavior of spectra and the power-law distributions of the probability density [3, 4]. The stochastic framework mainly gives a macroscopic insight into the modeled system, while the microscopic behavior is currently also of considerable interest. In this contribution we provide a version of an agent-based herding model with a transition to the nonlinear stochastic equations of trading activity and return in financial markets. We have modified Kirman's ant colony agent-based model [5] and introduced the trading activity as a measure of agent interaction frequency. This results in nonlinear stochastic differential equations for the return, adjustable to the expected statistical properties.
[1] V. Gontis, J. Ruseckas, A. Kononovičius,
Long-range memory stochastic model of the return in financial markets,
Physica A 389, 2010, p. 100 - 106, arXiv:0901.0903v3 [q-fin.ST]
[2] V. Gontis, J. Ruseckas, A. Kononovičius. A Non-Linear Double
Stochastic Model of Return in Financial Markets, Stochastic Control, Chris
Myers (Ed.), ISBN: 978-953-307-121-3, Sciyo, 2010, p. 559-580.
[3] B. Kaulakys, J. Ruseckas, V. Gontis, M. Alaburda, Nonlinear stochastic
models of 1/f noise and power-law distributions, Physica A 365, 2006, p.
217-221, arXiv: cond-mat/0509626v1 [cond-mat.stat-mech].
[4] J. Ruseckas, B. Kaulakys, 1/f noise from nonlinear stochastic
differential equations, Physical Review E 81, 2010, 031105, arXiv:
1002.4316v1 [nlin.AO]
[5] A. P. Kirman, Ants, rationality, and recruitment, Quarterly Journal of
Economics 108, 1993, p. 137-156.
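A minimal sketch of the basic Kirman ant/herding model [5] in its unmodified form, showing the microscopic switching mechanism that underlies the stochastic equations mentioned in the abstract; the parameter values are illustrative.

```python
# Basic Kirman ant/herding model [5] (not the authors' modified version):
# N agents hold one of two states; an agent switches either spontaneously
# (strength EPS) or by being "recruited" by a randomly met agent (strength H).
# The fraction x = n/N then fluctuates with a unimodal or bimodal stationary
# distribution depending on the ratio of the two strengths, which is the
# microscopic basis of the stochastic equations discussed above.
import random
random.seed(3)

N = 100
EPS, H = 0.01, 0.30       # spontaneous switching vs. herding strength
STEPS = 200_000

n = N // 2                # number of agents in state "1"
trajectory = []
for t in range(STEPS):
    x = n / N
    # probability that one randomly chosen agent flips in this small step
    p_up = (1 - x) * (EPS + H * x)        # a "0" agent becomes "1"
    p_down = x * (EPS + H * (1 - x))      # a "1" agent becomes "0"
    r = random.random()
    if r < p_up:
        n += 1
    elif r < p_up + p_down:
        n -= 1
    trajectory.append(n / N)

mean_x = sum(trajectory) / len(trajectory)
var_x = sum((v - mean_x) ** 2 for v in trajectory) / len(trajectory)
print(f"mean fraction {mean_x:.3f}, variance {var_x:.4f}")
```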
Igor Grabec, Franc Švegl
Forecasting the development of traffic jams at a highway bottleneck
Maintenance works on highways require the installation of bottlenecks, which cause instabilities of traffic flows and the development of severe jams. In order to analyse the corresponding traffic instability and minimize its influence, road operators have to forecast the properties of the traffic jam as a function of the planned bottleneck structure. The article presents an intelligent system that has recently been developed for this purpose. The system includes a non-parametric statistical model for forecasting the traffic flow rate on the road network in Slovenia and an analytical model for predicting the evolution of a traffic jam at a bottleneck. For this purpose a new fundamental diagram of traffic flow was developed that accounts for the bottleneck characteristics. The performance of the corresponding computer program is demonstrated using records of traffic flow rate at a point of maximal traffic activity on a highway close to the city of Ljubljana.
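A minimal deterministic sketch of the queueing logic behind such forecasts: given an assumed hourly demand profile and a reduced bottleneck capacity, the queue grows whenever demand exceeds capacity. The authors' non-parametric demand model and new fundamental diagram are not reproduced here.

```python
# Minimal deterministic sketch (not the authors' model): growth and decay of a
# queue at a work-zone bottleneck, given an assumed hourly inflow profile and
# a reduced bottleneck capacity.
CAPACITY = 3000.0          # veh/h through the bottleneck (assumed)
HOURS = list(range(24))
# assumed forecast demand profile (veh/h), with a morning and evening peak
demand = [500, 400, 300, 300, 500, 1200, 2800, 3800, 3500, 2500, 2000, 1800,
          1900, 2000, 2200, 2800, 3600, 3900, 3000, 2200, 1500, 1000, 800, 600]

queue = 0.0                # vehicles waiting upstream of the bottleneck
for h, q_in in zip(HOURS, demand):
    outflow = min(CAPACITY, q_in + queue)   # bottleneck serves at most CAPACITY
    queue = max(0.0, queue + q_in - outflow)
    delay = queue / CAPACITY                # rough mean extra delay in hours
    print(f"{h:02d}:00  inflow={q_in:5.0f}  queue={queue:7.0f} veh  "
          f"delay = {60 * delay:5.1f} min")
```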
Charles van den Heuvel, Richard Smiraglia
Idea Collider
Elementary knowledge structures in the humanities
The debate on atoms and voids in the universe (atomism) had an important impact on the emerging discipline of library science in the 19th century, in which the universe of knowledge was a regularly used metaphor. The use of this metaphor got a new impetus with the first formulations of relativity and quantum theory. At the beginning of the 20th century, classificationists such as the Belgian Paul Otlet (1868-1944) visualized thought experiments in which the knowledge contained in all sorts of documents was broken down into the tiniest elements and recombined into new knowledge structures. During the tests with the Large Hadron Collider at CERN in 2008, Richard Smiraglia (University of Wisconsin, Milwaukee) and Charles van den Heuvel set up a similar thought experiment to develop an Idea Collider that enables the identification of elementary structures of knowledge and their reconstruction along structural (syntactical) lines rather than with a semantic approach alone. In this paper I will discuss its potential for information retrieval in the humanities.
- Heuvel, C. van den, & Smiraglia, R.P. (2010). "Concepts as Particles: Metaphors for the Universe of Knowledge", in Gnoli, C. and Mazzocchi, F. (Eds.), Paradigms and Conceptual Systems in Knowledge Organization: Proceedings of the Eleventh International ISKO Conference, 23-26 February 2010, Rome, Italy, Ergon-Verlag, Würzburg, pp. 50-56.
- Smiraglia, R.P., & Heuvel, C. van den (2011). "Idea Collider: From a Theory of Knowledge Organization to a Theory of Knowledge Interaction", Bulletin of the American Society for Information Science and Technology, April/May 37 (4), pp. 43-47.
Jaroslav Hlinka
What can we learn from complex spatiotemporal dynamics of brain activity?
The human brain is an iconic example of a complex system. In recent decades, neuroimaging research has gathered an exponentially increasing wealth of data and results documenting its complex spatiotemporal behaviour. The motivation stems from the impact that our understanding of brain function and disease has on human well-being and socio-economic prosperity, but also from the deeply human fascination with the question of our identity.
Apart from strictly neuroscientific findings, the increasingly
multidisciplinary area of brain research provides methodological and
theoretical links to other fields of complex system study, fostering
inter-disciplinary discussion and research. The presentation will give an
overview of challenges regarding complex spatiotemporal properties of brain
activity. Some of the highlighted features will include multi-scale temporal
activity, with nested fast and slow oscillations and 1/f-type frequency
distribution, and dynamic formation of functional networks based on a stable
structural connectivity substrate.
However, the ultimate question is how a system of cooperating and/or
competing (neural) populations can lead to an emergence of robust but highly
flexible, effective and coordinated behaviour within a dynamic environment.
We believe some of these questions, related to self-organization, are highly
relevant for the whole field of complex systems.
Remco van der Hofstad
Shortest-weight problems on random graphs
We investigate shortest-weight problems on the configuration model, in which flow passes through the network minimizing the total weight along edges. In practice, one is interested both in the actual weight of the minimal-weight path, which represents its cost, and in the number of edges used, or hopcount, as this is often a good measure of the delay observed in the network.
We assume that the edge weights, which represent the cost of using the edge in
the network, are independent random variables.
We then investigate the total weight and hopcount of the minimal weight path. We
study how the minimal weight and hopcount depend on the structure of the edge
weights as well as on the structure of the graph. We present some recent results,
as well as conjectures related to weak and strong disorder.
The above research is inspired by transport in real-world networks, such as the Internet. Measurements have shown fascinating features of the Internet, such as the `small-world phenomenon', which states that typical distances in the network under consideration are small.
Also, the degrees in the Internet are rather different from the degree structure in classical random graphs. The Internet is a key example of a complex network, other examples being the Internet Movie Database, social networks, biological networks, the WWW, etc.
Interestingly, many such complex networks share features with it. For example,
the `six degrees of separation' paradigm states that social networks are small
worlds, and many related complex networks are as well.
[This is joint work with Gerard Hooghiemstra, Shankar Bhamidi, Piet Van Mieghem,
Henri van den Esker and Dmitri Znamenski.]
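A minimal numerical sketch of the two quantities discussed above, computed on a configuration-model graph with i.i.d. exponential edge weights (one convenient choice among those covered by the theory).

```python
# Minimal sketch: weight and hopcount of shortest-weight paths on a
# configuration-model graph with i.i.d. exponential edge weights.
import random
import networkx as nx

random.seed(4)

# power-law-ish degree sequence (even sum required by the configuration model)
degrees = [max(2, int(random.paretovariate(1.5))) for _ in range(2000)]
if sum(degrees) % 2:
    degrees[0] += 1

G = nx.configuration_model(degrees, seed=4)
G = nx.Graph(G)                      # collapse parallel edges
G.remove_edges_from(nx.selfloop_edges(G))
for u, v in G.edges():
    G[u][v]["weight"] = random.expovariate(1.0)

# sample shortest-weight paths between random pairs in the giant component
giant = max(nx.connected_components(G), key=len)
nodes = list(giant)
weights, hops = [], []
for _ in range(200):
    s, t = random.sample(nodes, 2)
    length, path = nx.single_source_dijkstra(G, s, t, weight="weight")
    weights.append(length)
    hops.append(len(path) - 1)

print("mean weight of minimal-weight path:", sum(weights) / len(weights))
print("mean hopcount:", sum(hops) / len(hops))
```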
Yurij Holovatch
Collective behaviour in complex networks: scaling and beyond
In collaboration with C. von Ferber (Coventry/Freiburg), R. Folk (Linz), R. Kenna (Coventry), V. Palchykov (Lviv).
Phase transitions and critical behavior in complex
networks currently attract much attention because of their unusual features
and broad array of applications, ranging from socio- to nanophysics. The
questions we address in this report concern two fundamental principles of
critical phenomena: universality and scaling. Both of these questions have
to be reconsidered when a system resides on a network. To this end, we
consider several simple models on scale-free networks and analyze their
critical behavior in terms of scaling functions which are of fundamental
interest in the theory of critical phenomena. We obtain general scaling functions for the equations of state and thermodynamic functions, extending the principle of universality to systems on scale-free networks and quantifying the impact of fluctuations in the network structure on critical behavior. Moreover, we address the logarithmic corrections to the leading
power laws governing thermodynamic quantities that appear as the second
order phase transition point is approached. We show the validity of scaling
relations for the new set of the logarithmic correction-to-scaling exponents
and derive new scaling relations for the exponents of logarithmic
corrections, for which these relations were unknown.
1. C. von Ferber, R. Folk, Yu. Holovatch, R. Kenna, V. Palchykov. Phys. Rev.
E (2011), (to appear) [arXiv:1101.3680v1].
2. V. Palchykov, C. von Ferber, R. Folk, Yu. Holovatch, R. Kenna. Phys. Rev.
E, vol. 82 (2010) 011145.
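For readers outside critical phenomena, the standard scaling ansatz and the conventional definition of logarithmic correction exponents used in this context; the new scaling relations derived for scale-free networks are given in references 1 and 2 above.

```latex
% Standard scaling ansatz for the singular part of the free energy near the
% critical point (t = reduced temperature, h = field), and the conventional
% definition of logarithmic correction exponents (hatted quantities).
\[
  f_s(t,h) \simeq |t|^{2-\alpha}\, \Phi\!\left(\frac{h}{|t|^{\Delta}}\right),
  \qquad
  m \sim |t|^{\beta}\,\bigl|\ln |t|\bigr|^{\hat{\beta}},
  \qquad
  \chi \sim |t|^{-\gamma}\,\bigl|\ln |t|\bigr|^{\hat{\gamma}} .
\]
```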
Janusz Holyst
CYBEREMOTIONS – Collective Emotions in Cyberspace
Emotions are an important part of most societal dynamics. As with face-to-face meetings, Internet exchanges may include not only factual information but also emotional information: how participants feel about the subject discussed or about other group members. The development of automatic sentiment analysis has made possible large-scale emotion detection and analysis using text messages collected from the web. Here, results of two years of studies performed in the framework of the EU project CYBEREMOTIONS (Collective Emotions in Cyberspace) will be presented. The project associates nearly 40 scientists from Austria, Germany, Great Britain, Poland, Slovenia and Switzerland. The results include the automatic collection and classification of sentiment data in various e-communities, qualitative and quantitative sentiment data analysis, data-driven modeling of collective emotions by agent-based models, complex networks and fluctuation-scaling paradigms, and the development of emotionally intelligent ICT tools such as affective dialog systems and graphically animated virtual agents that communicate by emotional interactions.
The emergence of collective emotions in cyber-communities will be demonstrated by applying four different methods and using independent datasets that include several million records: (i) the distribution of emotional avalanches observed in BBC blogs and Digg data; (ii) the non-random distribution of emotional clusters observed in Blogs06, the BBC Forum, Digg and IRC channels; (iii) the persistent character of sentiment dynamics observed for IRC channels using Hurst exponent analysis; (iv) the causal sentiment triad distribution found by network motif analysis.
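A minimal sketch of the Hurst-exponent analysis mentioned in point (iii), using a rescaled-range (R/S) estimator on a placeholder series; the project may rely on other estimators and, of course, on real sentiment data.

```python
# Sketch of a rescaled-range (R/S) estimate of the Hurst exponent. The
# "sentiment series" here is only a synthetic placeholder.
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=20_000)            # placeholder sentiment-valence series

def hurst_rs(series, window_sizes):
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            y = np.cumsum(w - w.mean())           # cumulative deviation profile
            r = y.max() - y.min()                 # range
            s = w.std()
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)       # R/S ~ n^H
    return slope

H = hurst_rs(x, window_sizes=[16, 32, 64, 128, 256, 512, 1024])
print(f"estimated Hurst exponent: {H:.2f}   (about 0.5 for uncorrelated noise)")
```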
Stefan Hutzler
Analysis of online betting exchange data for the 2008 Champions League football tournament
(Stefan Hutzler, Stephen J. Hardiman and Peter Richmond(*) School of Physics, Trinity College Dublin, Ireland (*)Complex and Adaptive Systems Laboratory, University College Dublin, Ireland)
Online betting has become increasingly popular. From our analysis of data for football matches traded at the betting exchange betfair.com, we find that, as in financial markets, the probability distributions of changes in the market price (odds) exhibit fat tails [1]. Statistical differences exist between the returns that occur when the matches are under way (which we argue are driven by match events) and the returns that occur during half-time (which we ascribe to trader-driven noise) [2].
[1] Hardiman SJ, Tobin ST, Richmond P and Hutzler S
(2011), Distributions of certain market observables in an on-line betting
exchange, Dynamics of Socio-Economic Systems, 2, 121-137.
[2] Hardiman SJ, Richmond P and Hutzler S (2010), Long-range correlations in
an online betting exchange for a football tournament, New Journal of Physics,
12, 105001.
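A minimal sketch of the tail analysis behind [1]: empirical tail fractions and a crude Hill estimate of the tail exponent for absolute log-returns of the traded odds; the odds series here is a synthetic placeholder, not betfair.com data.

```python
# Sketch of a fat-tail check on odds returns (placeholder data, not betfair).
import numpy as np

rng = np.random.default_rng(6)
# placeholder odds series with heavy-tailed increments
odds = 2.0 * np.exp(np.cumsum(0.002 * rng.standard_t(df=3, size=5000)))
returns = np.abs(np.diff(np.log(odds)))           # absolute log-returns of odds

# empirical tail: fraction of returns beyond several standard deviations
for nsd in (3, 5, 10):
    threshold = returns.mean() + nsd * returns.std()
    print(f"P(|r| > mean + {nsd} sd) = {(returns > threshold).mean():.4f}")

# crude tail-exponent estimate (Hill estimator on the 200 largest observations)
k = 200
tail = np.sort(returns)[-k:][::-1]
hill_alpha = k / np.sum(np.log(tail / tail[-1]))
print(f"Hill tail-exponent estimate: {hill_alpha:.2f}")
```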
Hans Kamermans
Predicting the Past for Present Purposes
Archaeological predictive modelling is a technique to predict, at a
minimum, the location of archaeological sites or materials in a region,
based either on the observed pattern in a sample or on assumptions about
human behaviour.
Archaeologists use predictive modelling for two applications: to gain
insight into former human behaviour in the landscape and to predict
archaeological site location to guide future developments in the modern
landscape. The latter application is part of archaeological heritage management and has been heavily criticized by academic researchers. This
presentation will summarize the critique, discuss the possibilities of
predictive modelling and present some avenues for future research.
Bronislovas Kaulakys, Miglius Alaburda, Vygintas Gontis and Julius Ruseckas
The inverse cubic distributions from the point process model
A well-identified stylized fact is the so-called inverse cubic power-law of the cumulative distributions of the number of trade events and of the logarithmic price change, which is relevant to the developed stock markets, to the commodity markets, as well as to the most traded currency exchange rates [1-4].
A simple model, based on the point process model of 1/f noise [5], generating
the long-range processes with the inverse cubic cumulative distribution is
proposed and analyzed.
Main assumptions of the model are:
(i) the restricted additive Brownian motion in time of the inter-event interval
for the frequent events (i.e., for small inter-event time τ) and
(ii) the multiplicative motion of the inter-event time τ(t) with the
multiplicative noise proportional to the intensity of the process, 1/τ(t),
for the large inter-event times [6].
It is shown that the additional Poissonian stochasticity of the event occurrence times does not influence the main conclusions of the model.
More complex equations for modeling the financial systems using the point
process model and stochastic nonlinear differential equations have been
introduced and analyzed in Refs. [7, 8].
1. P. Gopikrishnan, M. Meyer, L. A. N. Amaral and H. E. Stanley,
Inverse cubic law for the distribution of stock price variations, Eur. Phys. J.
B 3, 139 (1998).
2. X. Gabaix, P. Gopikrishnan, V. Plerou and H. E. Stanley, A theory of
power-law distributions in financial market fluctuations, Nature (London), 423,
267 (2003).
3. R. K. Pan, S. Sinha, Inverse-cubic law of index fluctuation distribution in
Indian markets, Physica A 387, 2055 (2008).
4. G.-H. Mu and W.-X. Zhou, Tests of nonuniversality of the stock return
distributions in an emerging market, Phys. Rev. E 82, 066103 (2010).
5. B. Kaulakys, V. Gontis and M. Alaburda, Point process model of 1/f noise vs a
sum of Lorentzians, Phys. Rev. E 71, 051105 (2005).
6. B. Kaulakys and M. Alaburda, Modeling the inverse cubic distributions by
nonlinear stochastic differential equations, ICNF2011, Toronto, 12-16 June, 2011
(to be published).
7. V. Gontis and B. Kaulakys, Modeling long-range memory trading activity by
stochastic differential equations, Physica A 382, 114 (2007).
8. V. Gontis, J. Ruseckas and A. Kononovicius, A long-range memory stochastic
model of the return in financial markets, Physica A 389, 100 (2010).
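A minimal sketch of ingredient (i) only: the inter-event time performs a bounded additive Brownian motion and the observable is the number of events per time window. The multiplicative regime (ii) and the resulting inverse cubic law are developed in [5, 6]; all parameters below are illustrative.

```python
# Toy version of ingredient (i) of the point process model: the inter-event
# time performs an additive Brownian motion, clipped at lower and upper bounds
# (a crude stand-in for the restricted motion), and the observable is the
# number of events per time window. The multiplicative regime (ii) and the
# inverse cubic law itself are treated in [5, 6].
import numpy as np

rng = np.random.default_rng(7)
SIGMA, TAU_MIN, TAU_MAX = 0.02, 0.05, 5.0
N_EVENTS, WINDOW = 200_000, 10.0

tau = 1.0
event_times = np.empty(N_EVENTS)
t = 0.0
for k in range(N_EVENTS):
    tau += SIGMA * rng.normal()                  # additive Brownian step
    tau = min(max(tau, TAU_MIN), TAU_MAX)        # clipped at the bounds
    t += tau
    event_times[k] = t

# number of events N in consecutive windows of fixed length
counts = np.bincount((event_times // WINDOW).astype(int))
print("mean events per window:", counts.mean())
print("max events per window:", counts.max())
```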
Renaud Lambiotte
The Personality of Popular Facebook Users
Social science aims at understanding how large-scale behaviour emerges from the intrinsic properties of a large number of individuals and their pairwise interactions. Contrary to network connectivity, whose organization has been explored in email or mobile phone data, the psychological profile of large-scale populations has not been studied so far. In this work, we have analyzed data from a highly popular Facebook application that is able to survey a very large number of Facebook users with peer-reviewed personality tests. Based on test results, we study the relationship between network importance (number of Facebook contacts) and personality traits, the first study of its kind on a large number of subjects (400,000). We test to what extent two prevalent viewpoints hold: that sociometrically popular Facebook users (those with many social contacts) are the ones whose personality traits either predict many offline (real-world) friends or predict a propensity to maintain superficial relationships. We find that the strongest predictor for the number of friends in the real world (Extraversion) is also the strongest predictor for the number of Facebook contacts. We then examine a widely held conjecture that has been put forward by literary intellectuals and scientists alike but has not been tested: that people who have many social contacts on Facebook are the ones who are able to adapt themselves to new forms of communication, present themselves in likable ways, and have a propensity to maintain superficial relationships. We show that there is no statistical evidence to support such a conjecture.
Nelly Litvak
Correlations between power law parameters in complex networks
Correlations in complex networks play an important role in, for instance, the robustness of the Internet, the range of an epidemic spread, and information ranking. Yet, the mathematical modelling and analysis of these correlations is a largely unresolved issue. In this talk we study the dependence between the in-degree of a node and its ranking score computed by the celebrated Google PageRank algorithm. In a power-law network, the PageRank ranking scores appear to follow a power law with the same exponent as the in-degree. In this talk we characterize the correlations between the in-degree and the PageRank score of a randomly chosen node. The dependencies between power-law parameters can be evaluated using the so-called angular measure, a notion introduced in extreme value theory to describe the dependency between very large values of the coordinates of a random vector. We use this theory to measure dependencies in Wikipedia, Web, and artificially constructed preferential attachment graphs. The results are strikingly different for the three samples. Next, for an analytical stochastic model that correctly captures the PageRank power-law behavior, we prove that the angular measure for in-degree and PageRank is concentrated in two points. This logically corresponds to the two main sources of high ranking: a large in-degree and a high rank of one of the ancestors. However, capturing the correlations observed in real data remains a challenging open problem.
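A minimal sketch of the in-degree versus PageRank comparison on a synthetic preferential-attachment graph; the rank correlation and top-tail overlap below are only crude stand-ins for the angular-measure analysis described above.

```python
# Sketch: compare in-degree and PageRank on a synthetic directed graph.
import networkx as nx
import numpy as np
from scipy.stats import spearmanr

G = nx.scale_free_graph(20_000, seed=8)        # directed, power-law degrees
G = nx.DiGraph(G)                              # drop parallel edges

pr = nx.pagerank(G, alpha=0.85)
indeg = dict(G.in_degree())

nodes = list(G)
pr_arr = np.array([pr[n] for n in nodes])
in_arr = np.array([indeg[n] for n in nodes], dtype=float)

# rank correlation over all nodes, and overlap of the two "top tails"
rho, _ = spearmanr(pr_arr, in_arr)
top = 200
top_pr = set(np.argsort(pr_arr)[-top:])
top_in = set(np.argsort(in_arr)[-top:])
print(f"Spearman rank correlation: {rho:.2f}")
print(f"overlap of top-{top} by PageRank and by in-degree: {len(top_pr & top_in)}")
```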
Juergen Mimkes
The econophysics of autocracy and democracy in the Arab world
The thermodynamic formulation of social laws is based on the law of statistics under constraints, the Lagrange principle. This law is called the "free energy principle" in physics and has been applied successfully to all fields of natural science. In the social sciences it may be applied to the collective and individual behaviour of social, political or religious groups.
In homogeneous atomic systems we find three states, solid, liquid and gas, depending on temperature and pressure. In homogeneous social systems we also find three states, collective, individual and global, depending on the standard of living and social pressure. In homogeneous political systems we again find three states, autocratic, democratic and global, depending on the standard of living and military pressure. For the 90 biggest countries in the world, the transition from autocracy to democracy lies in the range of 2,500 to 4,000 US$ per capita and a fertility rate below 3 children per woman. These parameters have been applied to the Arab world.
Eric Postma
Van Gogh's Uncertainty Principle
The presentation provides an overview of our attempts to identify forgeries by digitally analyzing van Gogh's paintings. In our approach, the joint uncertainty of location and spatial frequency, as addressed by Dennis Gabor in 1964, plays a central role.
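A minimal sketch (not the authors' pipeline) of Gabor filter responses, the tool that realizes the joint localization in space and spatial frequency referred to above; the image, frequencies and orientations are placeholders.

```python
# Sketch: a small Gabor filter bank applied to an image. Texture statistics of
# such responses are a common basis for brushstroke analysis; the scales and
# orientations below, and the test image, are placeholders.
import numpy as np
from skimage.filters import gabor
from skimage.data import camera        # stand-in image, not a van Gogh scan

image = camera().astype(float)

features = []
for frequency in (0.1, 0.2, 0.4):            # cycles per pixel (assumed scales)
    for theta in np.linspace(0, np.pi, 4, endpoint=False):
        real, imag = gabor(image, frequency=frequency, theta=theta)
        magnitude = np.hypot(real, imag)
        features.append((frequency, theta, magnitude.mean(), magnitude.std()))

for freq, theta, mean, std in features:
    print(f"f={freq:.1f} theta={theta:4.2f}  mean={mean:8.2f}  std={std:8.2f}")
```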
Araceli N. Proto
Classical, Semiclassical and Quantum Models for Understanding Human Systems
The intense use of Information and Communication Technologies (ICT) in individuals' lives makes old problems (such as socioeconomic uncertainty and labor stress) emerge strongly, together with new problems such as the digital divide and its consequent inequality, data protection, privacy, security and intellectual copyright, demanding the construction of new paradigms. Interaction among individuals, companies and governments occurs at great speed, not only between them but also among themselves. It is therefore necessary to develop tools for evaluating and prospecting the complex dynamics of the information society in all involved areas (social, psychological, economic, productive, legal, ethical) in a systematic way, in the understanding that the impact of ICT on society is irreversible and unavoidable. It thus seems necessary to understand how individuals, collections of individuals, and political and economic decision makers behave, considering that technological advances are faster than individuals' psychological adaptation capacities (1). Aiming at the analysis of decision-making processes, we describe some models coming from classical as well as quantum physics to provide a theoretical framework for at least some aspects of human behavior. The models are also suitable for problems in mathematical psychology. We have extensively studied these models before applying them to human systems (2, 3, and references therein).
1) J. R. Busemeyer, Z. Wang, J. T. Townsend, Quantum dynamics of human decision-making, Journal of Mathematical Psychology 50 (2006) 220-241.
2) C. M. Sarris, A. N. Proto, Decision making under stress: a Quantum Model approach, Volume 2, Issue 2, 2010, Pages 359-373, ISSN 0974-6811.
3) L. F. Caram, C. F. Caiafa, A. N. Proto and M. Ausloos, Dynamic Peer-to-Peer Competition, Physica A 389 (2010) 2628-2636, ISSN 0378-4371.
Milan Rajkovic
Statistical Mechanics of Simplicial Complexes: Spectral Entropy
In this exposition we focus on simplicial complexes (obtained from random, scale-free networks and networks with exponential connectivity distributions) and their persistent homological and cohomological properties. Simplicial complexes may be constructed from undirected or directed graphs (digraphs) in several different ways. Here we consider two of them: the neighborhood and the clique complex. We show how a new branch of statistical mechanics may be introduced, which we call the statistical mechanics of simplicial complexes. We also explore the topological properties of independent sets corresponding to each type of complex network, and their statistical features. Of special interest are the properties of eigenvalues and eigenfunctions of the non-normalized combinatorial Laplacian. Spectral entropy, obtained from the combinatorial Laplacian, reflects many new properties of a simplicial complex (complex network) and thus has important practical applications. We also derive a different type of entropy from a purely combinatorial aspect and explore its properties. In order to illustrate the advantages of the simplicial complex approach over the standard graph (network) approach, we present results of the analysis of several networks of social type.
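A minimal sketch of a spectral entropy computed from Laplacian eigenvalues; it uses only the ordinary graph Laplacian (the lowest-order case), not the higher-order combinatorial Laplacians of the neighborhood or clique complexes analyzed in the talk.

```python
# Sketch: spectral entropy from the (graph) Laplacian of a network, i.e. the
# lowest-order combinatorial Laplacian; higher-order Laplacians of the full
# simplicial complex are not constructed here.
import numpy as np
import networkx as nx

def spectral_entropy(G):
    L = nx.laplacian_matrix(G).toarray().astype(float)
    eigenvalues = np.linalg.eigvalsh(L)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]     # drop zero modes
    p = eigenvalues / eigenvalues.sum()                # normalise to a distribution
    return float(-(p * np.log(p)).sum())

er = nx.erdos_renyi_graph(300, 0.03, seed=9)           # random
ba = nx.barabasi_albert_graph(300, 4, seed=9)          # scale-free
print("spectral entropy, ER graph:", round(spectral_entropy(er), 3))
print("spectral entropy, BA graph:", round(spectral_entropy(ba), 3))
```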
Peter Richmond
House prices in London and Dublin revisited
A few years ago (2006) we analyzed data for house prices in both London and Dublin. An outcome of the analysis was that the then level of house prices would not be sustained and a fall in prices was imminent. How have prices fared since that time? How good was our prediction? In this presentation we review recent developments and update our analysis.
Giulia Rotundo
Complex principal component analysis of directed networks
A principal component analysis of a network comprising two distinct communities is presented, in line with studies on the structural properties of citation networks. The communities considered, which hold markedly different opinions, are the Neocreationist and Intelligent Design Proponents (IDP) on the one hand, and the Darwinian Evolution Defenders (DED) on the other hand. The eigenvalues of the various whole, intra- and inter-community adjacency matrices are calculated before a Principal Component Analysis (PCA) is performed. The citations between agents being intrinsically directed and not necessarily reciprocal, the adjacency matrices have complex eigenvalues, and hence complex components of the eigenvectors. The PCA technique thus has to be generalized to the complex plane. As in standard cases, only two eigenvalues are selected for further discussion: those having the largest real parts. Polar plots of the corresponding eigenvector components are presented to initiate the discussion.
Stef Scagliola & Franciska de Jong
Enhanced Publication: e-humanities practices and multi-media data
The inevitable ‘digital turn’ in the Humanities has inspired many scholars in the Information Sciences to develop ICT tools that can be applied in various phases of the research process: searching for data, processing data, sharing research results and presenting them. Yet applying technological tools without the appropriate ‘mindset’, in which concepts such as data sharing, inter-subjectivity, open source and multi-disciplinarity have become common practice, can be problematic. This is certainly true in a realm where status and credits are inextricably connected to measurements of excellence in a competitive academic market. As a diligent apprentice of ICT applications in the field of the Humanities, and inspired by what anthropologists term ‘participatory observation’ of the academic ‘tribe’, I would like to: 1. present some thoughts about the necessary preconditions for changing old habits and ingrained conventions in the Humanities, with an emphasis on oral history/qualitative research, and 2. illustrate my observations with an example of the application of ICT in qualitative research, the so-called Enhanced Publication.
Andrea Scharnhorst
Evolution of the Wikipedia category structure
Wikipedia has a category feature, in place since 2004, through which users are invited to tag (categorize) articles. Specific pages, so-called category pages, have been introduced; with their help, relations between regular articles as well as between category pages can be defined. As a result, we have two networks, the network between Wikipedia articles and the network between Wikipedia category pages, both of them directed. In this paper, we analyze the evolution of the category system in terms of the number of pages and the number of links. We reconstruct the category link network of Wikipedia at various time intervals and look into how this network and its properties evolve in time, especially during rapid re-organizations.
(Suchecki, K.; Scharnhorst, A.; Akdag-Salah, A.; Gao, C.)
Ingve Simonsen
Persistent collective trend in stock markets
This talk presents empirical evidence for a significant difference in the collective trend of share prices during periods when the stock index is rising and when it is falling. Data on the Dow Jones Industrial Average and its stock components are studied between 1991 and 2008. Pearson-type correlations are computed between the stocks and averaged over stock pairs and time. The results indicate a general trend: whenever the stock index is falling, the stock prices change in a more correlated manner than when the stock index is rising. A thorough statistical analysis of the data shows that the observed difference is significant, suggesting a constant fear factor among stockholders.
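A minimal sketch of the computation described above, with synthetic returns generated from a common market factor in place of the DJIA data; with such symmetric placeholder data no rise/fall asymmetry is expected, the point is only the conditioning procedure.

```python
# Sketch: average pairwise Pearson correlation of stock returns, computed
# separately over days when the index rises and days when it falls.
# Real DJIA data are replaced by a synthetic placeholder with a market factor.
import numpy as np

rng = np.random.default_rng(10)
n_days, n_stocks = 2000, 30
market = rng.normal(0, 1, n_days)
returns = 0.5 * market[:, None] + rng.normal(0, 1, (n_days, n_stocks))
index_return = returns.mean(axis=1)

def mean_pairwise_corr(r):
    c = np.corrcoef(r, rowvar=False)
    iu = np.triu_indices_from(c, k=1)
    return c[iu].mean()

up = index_return > 0
print("mean pairwise correlation, index rising: ",
      round(mean_pairwise_corr(returns[up]), 3))
print("mean pairwise correlation, index falling:",
      round(mean_pairwise_corr(returns[~up]), 3))
```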
Sorin Solomon
Bosiljka Tadic
Emotions in Social Networks: Data Analysis and Agent-Based Modeling
Emotional reactions underlying social contacts are expected to play a role in communications on social networks on the Web. We present an analysis of an empirical dataset that we collected from MySpace networked dialogs and classified for emotional content with the ANEW methodology [1]. For the theoretical analysis we devised an agent-based model, where the emotional agents [2] are adapted to the network environment and the rules of their actions are reminiscent of those of Web-based social networks, i.e., with a precise account of each message passed among the agents. Our results on both the empirical and the simulated data reveal user community structure and characteristic patterns of emotional behavior of the users (agents) in the phase space of the two-dimensional emotion variables, arousal and valence [1: M. Suvakov et al. (in preparation)]. [2: F. Schweitzer and D. Garcia, EPJB, vol. 77, 597 (2010)].
Cecilia Vernia
Inverse problem for interacting models in social sciences
How can Mathematics contribute to social science? Is it possible to
describe social phenomena on solid scientific grounds? To address these
questions, I present a mathematical framework for studying collective human
behavior in situations where individuals are influenced by personal goals,
cultural influences and norms, but also by social factors, such as peer pressure
and herding effects. This goal will be achieved by integrating well known
econometric tools with mathematical techniques derived from theoretical physics,
in particular from statistical mechanics. Starting from real data coming from
the Italian national health system and using the inverse problem method, I try
to evaluate how people respond to screening invitations. The considered models
will assess the relative importance of different factors in making a given
choice (e.g. taking a test for the prevention of cancer), including peer-to-peer influences (family, acquaintances, etc.), cultural heritage, public information
campaigns and so on. The ultimate goal of this study should be, for example, to
determine the optimal allocation of resources for health prevention.
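A minimal sketch of the inverse problem for a mean-field discrete-choice model with a peer-influence term, fitted by maximum likelihood to synthetic data; the models actually fitted to the Italian screening data are more elaborate.

```python
# Sketch of a mean-field inverse problem: each person accepts screening with
# probability sigma(h + J*m), where m is the fraction of accepting peers.
# (h, J) are recovered by maximum likelihood from synthetic data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# synthetic "ground truth" and data: groups with varying peer acceptance rates
h_true, j_true = -1.0, 2.0
m = rng.uniform(0, 1, 500)                    # peer acceptance fraction per group
p = sigmoid(h_true + j_true * m)
y = rng.binomial(100, p)                      # accepted invitations out of 100

def neg_log_likelihood(params):
    h, j = params
    q = sigmoid(h + j * m)
    return -np.sum(y * np.log(q) + (100 - y) * np.log(1 - q))

result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
print("estimated (h, J):", np.round(result.x, 2), " true:", (h_true, j_true))
```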
Paul Wouters
Modelling research practices - the case of peer review
In this talk I will discuss two aspects of a running project on modelling the peer review process. First, I will discuss the merits and limitations of agent-based modelling compared to other forms of modelling, such as mathematical modelling. This will be done from the perspective of a reflexive constructivist framework of research. It is not self-evident how modelling as a methodology can fit within such a framework, since constructivism questions the assumptions that are often taken for granted in simulation research. I will show that models and simulations can nevertheless contribute in important ways to a thoroughly constructivist analysis of science and technology.
Second, I will discuss the state of affairs in modelling the scientific system itself. The presentation will conclude with a few examples from our current research project.
Sally Wyatt
On track: living and measuring everyday complexity
From the perspective of the recent ‘mobilities turn’ in the social sciences, everyday life involves mobility – people travel daily to go to work, to shop, to meet friends or to go to the dentist. This presentation will report the results of a recent study which developed new concepts and methods for analysing the ways in which people draw upon a range of resources to manage everyday mobility. The analysis builds on insights from time-geography, mobility studies and actor-network theory to develop a conceptual vocabulary for understanding the dynamic and situated nature of travel in everyday life. The study combines qualitative and quantitative data on hypermobile people in the Netherlands.
Oleg Yordanov
Dynamics of public opinion under different conditions
We study a model of public opinion dynamics in the presence of inflexible minorities under various conditions. In contrast to the majority of citizens, the inflexible people never change their view on a particular issue, see references [1,2]. First, we consider in detail the versions of the model with discussion groups of sizes four and five, for which we compute and present full phase diagrams for typical fractions of the "hard-core" devotees. Next, we introduce a modification of the model which allows for discussion groups of variable size. Compared to the model with discussion groups of size k, this version involves k-1 additional parameters and displays richer behavior. The model is analyzed by a combination of numerical and analytical methods. Finally, we extend the general sequential probabilistic model [3] so as to incorporate inflexible majorities, and present representative behaviors.
References:
[1] S. Galam, F. Jacobs, Physica A 381, 366-376 (2007).
[2] S. Galam, Physica A 389, 3619-3631 (2010).
[3] S. Galam, Europhys. Lett. 70 (6), 705-711 (2005).
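A minimal sketch of local-majority-rule opinion dynamics with inflexible agents and fixed-size discussion groups, in the spirit of [1, 2]; the tie-breaking rule and all parameters are assumptions of this toy version, and the variable-group-size extension discussed in the talk is not reproduced.

```python
# Sketch of local-majority-rule dynamics with inflexible agents: agents are
# repeatedly split into discussion groups of fixed size, each group adopts its
# majority opinion, and inflexibles never change. Ties (possible for even
# group sizes) are broken here in favour of opinion A.
import random
random.seed(12)

N, GROUP_SIZE, ROUNDS = 10_000, 4, 60
FRAC_INFLEXIBLE_A = 0.10        # "hard-core" devotees of opinion A

opinion = [random.random() < 0.4 for _ in range(N)]     # True = opinion A
inflexible = [i < int(FRAC_INFLEXIBLE_A * N) for i in range(N)]
for i in range(N):
    if inflexible[i]:
        opinion[i] = True

for r in range(ROUNDS):
    order = list(range(N))
    random.shuffle(order)
    for start in range(0, N - GROUP_SIZE + 1, GROUP_SIZE):
        group = order[start:start + GROUP_SIZE]
        votes_a = sum(opinion[i] for i in group)
        majority = votes_a * 2 >= GROUP_SIZE            # tie -> opinion A
        for i in group:
            opinion[i] = True if inflexible[i] else majority
    if r % 10 == 0:
        print(f"round {r:2d}: support for A = {sum(opinion) / N:.3f}")
```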
Poster: