Newsletter of HoloGenomics

Genomics, Epigenomics integrated into Informatics:


A Compilation by Andras J. Pellionisz, see Contact, Bio and References here

Any/all IP-related Contact is solely through

Attorney Kevin Roe, Esq. FractoGene Legal Department

155 E Campbell Ave, Campbell, CA 95008

Secured contact to Dr. Pellionisz

regarding Academic, Board, Non-Profit activities:

andras_at_pellionisz_dot_com or cell Four-Zero-Eight - 891- Seven - One - Eight - Seven

Skip to HoloGenomics Table of Contents

Genome is Fractal? - "Yeah, for sure!"

(Eric Schadt, Double-degree mathematician, Ph.D. in Biomathematics, Sept 15, 2014)

listen here

Mendelspod interview with Eric Schadt, Director of the $600 M Institute of Genomics and Multiscale Biology, NYC (Sept. 15, 2014)

Q [Theral Timpson, Mendelspod]: I have read that you are a Ph.D. in bio-mathematics?

A [Eric Schadt, Ph.D. in bio-mathematics]: Yes, bio-mathematics.

Q: I have recently met a Hungarian-American scientist, András Pellionisz, and he says that we need to bring math into biology and genetics and he says that


Do you buy any of that?

A [Eric Schadt]: YEAH, FOR SURE!

[What is the significance of Eric Schadt's confirmation of FractoGene (the utility derived from a fractal genome growing fractal organisms)? Schadt's credentials as a double-degree mathematician with a Ph.D. in bio-mathematics, a sterling record at Merck and Pacific Biosciences, and now the Directorship of the $600 M "Mount Sinai Institute for Genomics and Multi-scale Biology", together with his unbiased (straight-as-an-arrow) academic and personal integrity, would be extremely difficult to match globally for forming an independent professional judgement based on top command of both bio-mathematics and information theory & technology. Two times seven years have passed since the Human Genome Project: in 2000-2007, ENCODE-I first concluded that "Junk DNA is anything but", and the Central Dogmatism was proven to be one of the most harmful mistakes "of the history of molecular biology"; the wilderness of 2007-2014 followed. From 2000 to 2014, genomics essentially existed as "a new science without valid or even universally agreed upon definitions of theoretical axioms". Characteristically, even Eric Lander heralded globally that "nothing is true of the most important fundamental assumptions" (yet in 2009 put the Hilbert-fractal of the genome on the cover of Science, just two weeks after George Church invited Dr. Pellionisz to his Cold Spring Harbor Meeting). Detractors had to swallow their (sometimes ugly) words: the only "alternatives" to the mathematically solid and software-enabling FractoGene were a random sample of metaphors, such as "genome regulation is turning genes on and off", or "the genome is a language" (found not to be true twenty years ago, see Flam 1994).

Eric Schadt's academic endorsement of FractoGene goes back consistently to 2010 (if the theory that the fractal genotype is experimentally linked to the fractal phenotype were found true, "it would be truly revolutionary"). Well, since 2011 the compromised fractal globule has been linked by scores of top-notch independent experimental studies worldwide to cancer, autism, schizophrenia and a slew of auto-immune diseases.

What will the "academic endorsement" result in? First, as in the case of Prof. Schadt, leading academic centers are likely to gain intellectual leadership through schools of advanced studies, where non-profit applications (see below; already over a thousand) are streamlined by a thought leader of nonlinear dynamics as the intrinsic mathematics of living systems. Second, the IP (augmented by trade secrets since the last CIP in 2007) is likely to result in a for-profit application monopoly (in force over the US market till mid-March of 2026)]

[Dr. Pellionisz is legally permitted to practice Compensated Professional Services (Analysis, Advisory and Consulting roles, Board Membership, etc.) as long as there is no "Conflict of Interest", through Secured Contact (see above).

Communication regarding Intellectual Property of any kind, including but not limited to patents, trade secrets, know-how associated with Dr. Pellionisz must be strictly gated by "Attorney Kevin Roe, Esq. FractoGene Legal Department" (see above)]

Skip to Most Recent News (2014-2012)






2007 Post-Encode

2007 Pre-Encode




The Decade of Genomic Uncertainty is Over

The FractoGene Decade (2002-2012)

Pellionisz' FractoGene, 2002 (early media coverage)

Pellionisz' "FractoGene" patent, priority date 2002, was issued in 2012 (see the 2002 priority date and the 2007 CIP filing in Google Patents 8,280,641). The recursive fractal iteration utility was disseminated in a peer-reviewed paper and a Google Tech Talk YouTube video (Is IT Ready for the Dreaded DNA Data Deluge?), both in 2008, and presented in September 2009 at Cold Spring Harbor. The issued patent is in force till late March, 2026. The invention drew utility from RELATING genomic and organismic fractal properties. "Methods" were as described in the body of the application, plus ~750 pages of "Incorporation by Reference" ("should be treated as part of the text of the application as filed", see US MPEP 2163.07(b)). State-of-the-art methods beyond the CIP of Oct. 18, 2007 are handled as "Trade Secrets", as customary in the strongest combinations of Intellectual Property portfolios.

"Evidence for" and/or "Consistent with"??

As evident from the title of the paper above, its authors clearly refer to "evidence". Other authors of independent experimental investigations (an escalating number after the initial decade) consider their results merely "consistent with" the fractal organization found in the genome and/or in physiological or pathological (e.g. cancerous) organisms.

With the significance of such claims rapidly diverging in value ("evidence for" becoming extremely precious, while "consistent with" is generally regarded as almost meaningless), authors are respectfully requested to clarify their (sometimes unclear or ambiguous) claims: do they place themselves in the valuable "providing evidence for" category, or in the almost meaningless "consistent with" class? Clarification to HolGenTech_at_gmail_dot_com will help proper citation, if any. - Dr. Pellionisz

By 2012, independent researchers arrived at the breakthrough consensus, overdue since 2002. First ENCODE 2007, then ENCODE 2012, replaced the mistaken axioms of "Junk DNA" and "Central Dogma" with the "nolo contendere assumption" of "The Principle of Recursive Genome Function" (2008), which required the "nearest neighbor organization" experimentally found in the Hilbert-fractal of the genome at the later date of 2009. The independent illustrations above of both the genome and organisms exhibiting fractal properties put the challenge plainly in their RELATION. Methods, e.g. relating genomic fractal defects to the fractality of tumors in the genome disease of cancer, constitute secured intellectual property:

Eric Lander (Science Adviser to the President and Director of Broad Institute) et al. delivered the message
on Science Magazine cover (Oct. 9, 2009) to the effect:

"Mr. President; The Genome is Fractal !"

"Something like this (disruptions in the fractal structures leading to phenotypic change)" was shown to be true (starting in November 2011; see top-ranking independent experimentalists' publications cited below).

"Yeah, of course" - it is now "truly revolutionary".

There are only two questions for everyone:

(a) "What is in it for me?"

(b) "What is the deal?"

Proof of Concept (Clogged Fractal Structure Linked to Cancer) was already available
at the Hyderabad Conference (February 15, 2012)
Dozens of additional Independent Experimental Proof of Concept Papers were cited in
Hyderabad Proceedings

The genome is replete with repeats. If the fractal structure is compromised
(see laser beam pointing at where the "proximity" is clogged)
syndromes are already linked to cancer(s), autism, schizophrenia, auto-immune diseases, etc.
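The "proximity" property invoked above (genomically distant loci brought into spatial nearness by a space-filling fold) is the defining feature of the Hilbert curve. As an illustrative aside, here is a minimal Python sketch of the textbook index-to-coordinate algorithm; it is not any method cited or claimed in this newsletter, only a demonstration that positions adjacent along the curve are always adjacent on the grid:

```python
def d2xy(order, d):
    """Map a 1-D index d to (x, y) on a Hilbert curve filling a 2**order grid.

    Classic bit-manipulation algorithm, shown only to illustrate the
    locality-preserving property of the curve.
    """
    x = y = 0
    t = d
    s = 1
    side = 1 << order
    while s < side:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Locality: indices adjacent along the curve are adjacent on the grid.
points = [d2xy(4, d) for d in range(16 * 16)]
steps = [abs(ax - bx) + abs(ay - by)
         for (ax, ay), (bx, by) in zip(points, points[1:])]
```

Every consecutive pair of 1-D indices maps to grid cells exactly one step apart, which is why a compromised ("clogged") fold changes which sequence segments end up as spatial neighbors.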

Table of Contents

(Dec 14) Experts expose fundamental role of chaos and complexity in biological information processing (Genome is a Fractal/Chaotic System governed by nonlinear dynamics [AJP])
(Dec 04) Craig Venter on How the Genomic Era Is Just Starting
(Nov 15) Cancer Genomics: Complexity is in the Eye of the Bewildered
(Nov 07) Exponential Medicine by Singularity University - San Diego, 2014
(Nov 01) TIME Magazine 2003 Summit versus 2014 Special Issue
(Oct 26) Ovarian cancer oncogene found in 'junk DNA'
(Oct 23) Priority Health Becomes First Health Plan to Cover Foundation Medicine's Tests
(Oct 20) ASHG: Data Science to Help Genomics Move from 'Artisanal' to 'Factory' (Google? IBM? Apple? GE Health? Microsoft? ... or Fast and Furious?)
(Oct 10) Mount Sinai Opens New Genomics Lab (in Branford, Connecticut) with Bank of Ion Torrent Sequencers
(Sep 15) NCI, NVIDIA Providing $2M for Omics-based Cancer Data Research
(Sep 12) Fractal Genome yields 389,000 hits - here is a list of over a thousand FRACTAL items in PubMed
(Sep 11) 23andMe aims to be Google for genetic research
(Sep 10) Mayo Clinic, IBM Collaborate to Match Patients with Clinical Trials
(Sep 06) Venter steals top scholar from Google
(Sep 06) End of Summer - Beginning of a New Era
(July 01) Genome-based Personalization Industry
(May 22) De-Bullshitting Big Data (Stanford-Oxford, USA-Asia) - says Greg Kovacs
(May 8) Your Genes Are Obsolete
(Apr 30) Small non-coding RNAs could be warning signs of cancer
(Apr 14) The Brave New World of Medicine
(Apr 09) Russia Wakes UP to Genomics
(Mar 31) Dr. Erez Aiden named newest McNair Scholar at Baylor College of Medicine
(Mar 30) Google invites geneticists to upload DNA data to cloud
(Mar 22) Getting Cancer Wrong [Newsweek; cancer cries for mathematics of nonlinear-dynamics]
(Mar 22) Fractal geometry to help diagnose cancer [Fractals in India]
(Mar 22) Designing Fractal Nanostructured Biointerfaces for Biomedical Applications [Fractals in China]
(Mar 21) The New York Genome Center and IBM Watson Group Announce Collaboration to Advance Genomic Medicine
(Mar 18) A novel mechanism for fast regulation of gene expression
(Mar 15) S. Korea announces $540 million post-genome project
(Mar 12) Hacking Your DNA
(Mar 12) Really? One does not understand the genome by having sequenced it all?
(Mar 04) No longer junk: Role of long noncoding RNAs in autism risk
(Mar 04) Craig Venter's Latest Startup Gets $70 Million to Sequence Loads of Genomes
(Mar 02) GE Ventures support RainDance’s vision to make liquid biopsy a commercial reality
(Mar 01) Google Launches Genomics Effort, Joins Global Alliance
(Feb 25) Google Could Disrupt These 3 Medical Industries Within 10 Years
(Feb 19) Sony’s new genome analysis company is not playing catch-up with Calico
(Feb 16) SAP Pushes Easy-to-Use Software at Hub to Compete With Facebook
(Feb 15) Genome Interpretation Appliance; Designed in California, Developed in Europe & Mexico, Manufactured in Asia?
(Feb 07) Patent War-Weary Samsung Inks Cross-Licensing Deal with Cisco
(Feb 06) These 3 Tech Titans Could Revolutionize Health Care
(Feb 02) ILLUMINA Claims New Sequencer Transcribes 18,000 Genomes per Year at $1,000 Each
(Jan 26) LincRNA, once believed useless, plays role in genome
(Jan 24) Sony forms genome analysis company [with Illumina] in move towards personalized medicine
(Jan 18, 2014) Pulse of J.P. Morgan 2014: Interviews with 17 biopharma execs
(Dec 31, 2013) Eric Schmidt's 2014 predictions: big genomics and smartphones everywhere
(Dec 31) We should be looking at a complete paradigm shift
(Dec 30) "The Last Mohican" of "Junk DNA" Obsolete Axiom
(Dec 12) Breakthrough Towards Deciphering Genomic Code of Life is Likely to Come from Mathematics
(Dec 06) 23andMe to only provide ancestry, raw genetics data during FDA review
(Dec 05) Fractal Genome and its Chaotic Regulation?
(Dec 02) Gates Foundation to Double Donation [Bill Gates and Francis Collins]
(Nov 26) FDA warns Google-backed 23andMe to halt sales of genetic tests
(Nov 20) Mandelbrot, the Genius of Fractals - Born 89, Passed Away 3 Years Ago
(Nov 21) Would He Do It Again?
(Nov 17) Junk Genes Of Protein Codes Might Be Helpful To Understand Cancer
(Nov 07) HolGenTech Board of Advisers News - in Perspective
(Nov 07) The Genome Is Fractal - Distance function through loops of "Recursive Genome Function"
(Oct 27) Junk DNA as "Doing Nothing" became a Laughing Matter - Fractal Recursive Genome Function is the Prevailing Paradigm
(Oct 18) Improving Genome Interpretation ("Genomics is Mired in Misunderstanding")
(Sep 18) All Bets Are Off: Silicon Valley Goes For IT Google-Style!
(Sep 01) Most Assumptions in Molecular Biology are Wrong - Mattick says
(Aug 28) Genomic Dreams Coming Through in China
(July 31) Foundation Medicine seeks $86.3M IPO amid chase for reimbursement
(June 20) The Impact of Human Genome Program Nearing $1 Trillion - Battelle 2013
(May 23) Interpreting the Human Genome (Pioneers on a Shoestring)
(May 21) Professor John Mattick, Executive Director of the Garvan Institute: "the dam is about to burst"
(May 12) Non-coding RNA .. is acutely regulated .. in schizophrenia..!
(May 10) Metaphors of Fractal Dynamics to Multi-dimensional Systems
(Apr 25) Francois Jacob, Nobelist with Monod and Lwoff in 1965 for Operon gene regulation dies at 92
(Apr 25) Celebrate the Unknowns [Nature COMMENT on the 60th]
(Apr 14) Can Cancer Cells Solve the Puzzle of Junk DNA?
(Mar 29) Grappling with Cancer
(Mar 29) Genomic Science - A Possible Nobel Prize?
(Mar 13) Complete Genomics acquired by BGI of China on Sequoia Capital money from Silicon Valley, USA
(Mar 01) A Genetic Code for Genius? In China, a research project
(Feb 21) Breakthrough Prize on YouTube (and everywhere... spilling to BRIC)
(Feb 20) Breakthrough Prize in Life Sciences ($3 M each, for 5 scientists per year - list of first 11)
(Feb 08) Young Chinese scientists will map any genome
(Jan 24) Non-coding Mutations May Drive Cancer
(Jan 15) DNA pioneer James Watson takes aim at "cancer establishments"
(Jan 08) China, Regulation and Securing the Operating System of Life
(Jan 04) Playing Well with Others [When Industrialization of Genomics is no longer Academic Play]
(Jan 01) The Decade of Genomic Uncertainty is Over (R.I.P. 2002-2012)
(Dec 31) National Medal of Science Awarded to Leroy Hood
(Dec 25) [A Compelling Case for Fractal Analysis of Cancer and DNA] Is the Cure for Cancer Inside You?
(Dec 25) The FractoGene Decade (2002-2012)
(Dec 20) Illumina Stock Leaps On Roche Acquisition Reports
(Dec 19) Fractal Organization of Human T-cell
(Dec 17) Land Rush
(Dec 10) 2012 After a Decade of Uncertainty, Genomics is at Crossroads
(Nov 29) Genome Data Analysis Summit, San Francisco
(Nov 25) Architecture Reveals Genome’s Secrets
(Nov 23) Thanksgiving - with blessings counted it is time to address "horror vacui"
(Nov 23) Formula Unlocks Secrets of Cauliflower's Geometry
(Nov 12) Szentagothai Centenary Tribute in New York City
(Nov 14) Software developers analyzing patterns to boost odds against cancer
(Nov 10) Genomics Industrialized. Patents Drive Innovations
(Nov 07) FractoGene Emerges in a Global, Scalable Business Model, Protected IP and an Avalanche of Proof of Concept Results
(Oct 08) FractoGene Patent Licensed to Seven in Southern California in its First Week
(Oct 02) US Patent Office Issues FractoGene Patent to HolGenTech Founder Pellionisz
(Sep 06-15) ENCODE Gets out from a 40-Year Dead-End; up to a New Era of Global Industrial Genomics
(Aug 15) What Went Wrong and When?
(July 04) A Quest for Clarity [or Understanding?]
(Jun 26) Summer Solstice in Industrialization of Genomics
(Jun 10) The Lost Decade; Too Many Genomic Melt-downs (We Could Do Better)
(May 30) Chromosome structure fractal defects implicated in cancer
(May 18) In half a year, third independent experimental Proof of Concept that fractal defect of Copy Number Variation is implicated in cancer
(Apr 22) A Decade after Genomics was Declared to be Informatics; Vistas by Andras Pellionisz (Part II)
(Apr 18) Well, This Is Awkward
(Apr 12) Moore's Law versus Jevons' Law: Drowning in the Dreaded DNA Data Deluge is mandated by law, unless we change for better algorithmic architectures
(Apr 09) Just in Weeks, Second Independent Experimental "Proof of Concept" of Fractal Defects Causing Cancer
(Mar 14) Recent CDx, NGS Deals Signal Siemens' Increasing Interest in Genomics Arena
(Mar 02) The Creative Destruction of Medicine
(Feb 29) A scalable global business model and a (sub)continent offers support of Pellionisz' Fractal Approach
(Feb 06) Roche’s Illumina Bid May Spur Buying in DNA Test Land Grab
(Jan 31) Genome Informatics Globalized - USA - China - Korea - India
(Jan 14) UNESCO’s Memorial Year Honours János Szentágothai
(Jan 02) Power in Numbers [Eric Lander: math must grab genomics, beneath neuroscience – AJP]
(Jan 01) Paired Ends: Lee Hood, Andras Pellionisz - 2012 a Year of Turing and the Year of Turning
(Dec 27) The genetic code, 8-dimensional hypercomplex numbers and dyadic shifts
(Dec 10) Biophysicists Discover Four New [Fractal - AJP] Rules of DNA 'Grammar'
(Dec 07) Francis Collins [Head of NIH of the USA in Bangalore, India -AJP]
(Dec 06) Paired Ends: Lee Hood, Andras Pellionisz
(Dec 03) ELOGIC Technologies (Bangalore) to launch Genome Analytics Service (See Binet - Genome)
(Dec 02) DNA Sequencing Caught in Deluge of Data
(Dec 01) The FDA’s confusing stance on companies that interpret genetic tests is terrible for consumers
(Nov 30) ELOGIC Technologies Private Limited Bangalore, India [Pellionisz unites Silicon Valley-s of USA/India - AJP]
(Nov 29) The Search for RNAs
(Nov 28) High order chromatin architecture shapes the landscape of chromosomal alterations in cancer ["Fractal Defects" as root causes of cancer -AJP]
(Nov 14) Recursive Genome Function of the Cerebellum. Geometrical Unification of Neuroscience and Genomics
(Nov 12) Altered branching patterns of Purkinje cells in mouse model for cortical development disorder
(Oct 31) Geometric Unification of Neuroscience and Genomics
(Oct 27) ...Initially Jobs sought alternatives to surgery
(Sep-Oct) Bio-IT World; Savoring an NGS Software Smorgasbord
(Sep 23) A DNA Tower of Babel
(Sep 10) Sam Waksal, Pfizer Venture Investments, and More: Moderator Looks Forward to All-Star Chat at New York Life Sciences 2031
(Sep 05) Lost In Translation? Andy Grove blasts "Change the System!" in his Anti-Medical School Course at UC Berkeley
(Sep 03) W.M. Keck Foundation awards Jefferson scientists with $1M medical research grant [Rigoutsos - AJP]
(Sep 02) Samsung Launches Genome Analysis Service, Offers Free Genome
(Aug 20) Comment by Andras J. Pellionisz to New York Times "Cancer's Secrets coming into Sharper Focus"
(Aug 17) Everything Scientists Thought They Knew About Cancer Might Be Totally Wrong
(Aug 15) Spit and know your future [This time, for India... AJP]
(Jul 31) Researchers uncover a new method of checking for skin cancer
(Jul 17) A surge of top-quality papers pointing into "methylation-defects" as predicted by FractoGene as culprits for cancer
(Jul 21) How accurate is the new Ion Torrent genome, really? [Gordon Moore sequenced twice - AJP]
(Jul 21) [Former Intel President] Grove Backs an Engineer’s Approach to Medicine
(Jul 20) Loophole found in genetic traffic laws [Experimentally Proven Death Certificate of Crick's "Central Dogma" - AJP]
(Jul 16) Editing the genome - Scientists unveil new tools for rewriting the code of life
(Jul 15) Clue to kids' early aging disease found [The Colossal Paradigm-Shift - AJP]
(Jul 14) Researchers Use Genome Editing Methods to Swap Stop Codons in Living Bacteria
(Jul 09) The Mathematics of DNA [is Fractal - says Dr. Perez]
(Jul 08) Cell Surface as a Fractal: Normal and Cancerous Cervical Cells Demonstrate: Different Fractal Behavior of Surface Adhesion Maps at the Nanoscale
(Jul 08) China genomics institute outpaces the world
(Jul 06) Searching For Fractals May Help Cancer Cell Testing
(Jul 01) A quest for better genetics [from Moscow...]
(Jun 30) Study Suggests Widespread Loss of Epigenetic Regulation in Cancer Genomes
(Jun 27) "So What?" - if you separate Fractal Defects from Structural Variants of Human Diversity? - In Vivo Genome Editing!
(Jun 24) 23andMe-Led Team Reports on Findings from Web-Based Parkinson's GWAS
(Jun 23) Researchers Develop Methylation-Based Model for Predicting Age from Spit DNA
(Jun 20) Goodbye, Genetic Blueprint
(May 30) Cells may stray from 'central dogma'
(May 23) The fractal globule as a model of chromatin architecture in the cell
(May 22) The Principle of Recursive Genome Function: Quantum Biophysical Semeiotics clinical and experimental evidences
(May 22) The Myth of Junk DNA - an issue fallen from science in 2006 to a rejected ideology for the masses to chew on as an Amazon bestseller
(May 16) Eric Schadt Joins Mount Sinai Medical School [Dir. of Institute for Genomics AND Multiscale Biology]
(May 12) Battelle Study: The $796 Bn Economic Impact of the Human Genome Project
(May 08) In an improbable corner of China, young scientists are rewriting the book on genome research [Newsweek]
(May 04) Systems Biology 'Makes Sense of Life' [Once the System is Identified - AJP]
(May 01) Virginia Tech partners with NVIDIA to “Compute the Cure” for Cancer
(Apr 30) Breast cancer prognosis goes high tech [Fractal - AJP]
(Apr 16) FractoGene (2002) and Fractal Frenzy set off by The Principle of Recursive Genome Function, YouTube (2008) [AJP]
(Apr 12) The Structural Struggle - [vs. Fractal Algorithmic Elegance - AJP]
(Apr 11) Cancer center builds Texas-sized cloud [private cloud!]
(Apr 11) Cancer as Defective Fractal Recursive Genome Function (Pellionisz and Lander et. al trigger escalation of fractal approach)
(Apr 07) The Trouble with Genes [Article Gets Prize - Mattick joins leaders to admit that basic premises were all wrong - AJP]
(Mar 27) Eric Schadt Extreme Science [and other kooks - AJP]
(Mar 25) Global Scaling Institute of Germany explores roots of fractals with Euler
(Mar 24) Avesthagen launches Whole Genome Scanning [India blows away FDA -AJP]
(Mar 15) Complementing Private Domain Genome Sequencing Industry - the Birth of Genome Analytics Industry
(Mar 15) Scientists need new metaphor for human genome [or better yet, science for industrialization of the new paradigm - AJP]
(Mar 15) RNA regulation of human development, cognition and disease (Mattick in Dubai)
(Mar 14) Hamdan Bin Rashid to inaugurate HGM 2011 Monday
(Mar 01) DRC Computer Invites Dr. Andras Pellionisz to Advisory Board
(Feb 20) NHGRI Celebrates Tenth Anniversary of Human Genome Sequence [what went wrong - Green]
(Feb 19) Initial impact of the sequencing of the human genome [what went wrong according to Sci. Advisor of the President]
(Feb 14) Primates' Unique Gene Regulation Mechanism: Little-Understood DNA Elements Serve Important Purpose

For archived HoloGenomics News articles see Archives above

Latest News

Experts expose fundamental role of chaos and complexity in biological information processing. (In life, Fractal/Chaotic nonlinear dynamics governs [AJP])

Published on December 14, 2014 at 7:15 AM

Do chaos and complexity play a fundamental role in biological information processing?

[2012: Nicolis passed away (Mandelbrot in 2010). ENCODE-2 finally discarded the "frighteningly unsophisticated" junk/gene dogmas. While the science was introduced in 1989, the utility of FractoGene ("fractal genome grows fractal organisms") went into force within days (after a Decade of waste), as US patent 8,280,641. The global path to clinical application was also laid down in 2012 (prized by India), and a collaborative Springer textbook chapter on the science, with ample references to the new school, was also submitted in 2012. FractoGene is in force till late March of 2026.]

The interdisciplinary approach to problems that till recently were addressed in the hermetic framework of distinct disciplines such as physics, informatics, biology or sociology constitutes today one of the most active and innovative areas of science, where fundamental issues meet problems of everyday concern.

John Nicolis, an eminent Greek scientist and thinker who passed away unexpectedly on the 20th of April 2012, brought out to the highest degree the interdisciplinary approach to key scientific problems and, at the same time, their cultural dimension. Complexity has been at the core of his interests for almost 40 years. He imprinted on it a new direction focusing on the generation and processing of information in hierarchical systems, that is to say, systems involving coexisting components evolving on different scales and coupled to each other in a nonlinear fashion through positive and negative feedback loops. He defined three basic levels of organization:

-The "syntactical" level, where the elementary dynamical processes take place.

-The "semantic" level, where relationships form between stimuli impinging on a system and the "categories" related to global, collective properties (e.g. the attractors) that emerge from the syntactical level and follow a dynamics of their own.

-The "pragmatic" level, where different hierarchical systems are viewed as players communicating dynamically via a set of selected strategic rules such as cooperation, competition, cheating, etc.

Based on this vision, John Nicolis generated a phenomenal number of ideas and intuitions. In the present volume, surveys by eminent international specialists are provided of the mechanisms presiding over information processing and communication. Physical, biological and cognitive systems are approached from different, complementary points of view using the unifying methods of nonlinear dynamics, chaos theory, probability and information theories and complexity science. Unexpected connections between these disciplines are stressed by bringing together ideas and tools that had so far been developed independently of each other. Epistemological issues in connection with incompleteness and self-reference are also addressed.

The following topics are featured in the volume.

I. Glimpses at nonlinear dynamics and chaos:

Nonlinear dynamics and chaos theory provide the general setting within which complexity and information processing can be formulated. In the opening chapter by G. Contopoulos et al the transition from quantum to classical behaviour is analysed in the paradigmatic case of the scattering problem. The connection between classical and quantum descriptions is further addressed in the chapter by M. Axenides and E. Floratos, where the classic Lorenz attractor is revisited using a formulation originally developed by Nambu in the context of quantum mechanics. G. Tsironis et al discuss in their chapter the onset of spatio-temporal complexity in nonlinear lattices. In the chapter by D. Mac Kernan a systematic probabilistic approach is outlined based on coarse-grained description and symbolic dynamics. Symbolic dynamics is taken up again in the closing chapter of this Part by A. Shilnikov et al., where fractal-hierarchical organizations of the parameter space of Lorenz-type chaotic systems induced by homoclinic and heteroclinic bifurcations are revealed using a binary representation of the solutions.
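The Lorenz attractor revisited in the Axenides-Floratos chapter is easy to explore numerically. A minimal forward-Euler sketch with the standard parameters (sigma = 10, rho = 28, beta = 8/3) follows; this is an illustration only, not the Nambu formulation used in the chapter:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (standard parameters)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

# Sensitive dependence: two trajectories starting 1e-6 apart diverge.
a, b = (1.0, 1.0, 1.0), (1.0 + 1e-6, 1.0, 1.0)
for _ in range(2000):          # 20 time units at dt = 0.01
    a, b = lorenz_step(a), lorenz_step(b)
separation = max(abs(p - q) for p, q in zip(a, b))
```

Two trajectories started a millionth apart end up macroscopically separated while both remain bounded on the attractor: the hallmark of deterministic chaos.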

II. Chaos and information:

Information theory finds its origin in Shannon's classic 1948 paper. In the opening chapter of this Part, H. Haken develops the quantum expression of Shannon information, along with an extension of Jaynes' maximum entropy principle into the quantum domain. The conditions under which entangled states and long-range coherence can be secured, as necessary conditions for information processing at the quantum mechanical level, are addressed in the following chapter by S. Nicolis. A dynamical approach to information is subsequently developed in the chapter by C. Nicolis, devoted to nonlinear systems giving rise to multiple simultaneously stable states and to stochastic resonance. Different signatures of multistability and of stochastic resonance on a hierarchy of entropy-related quantities, characterizing the system as an information processor, are identified. Finally, in the closing chapter by W. Ebeling and R. Feistel, the origin of information processing is addressed in relation to the origin and evolution of life. Central to their approach is the idea that there exists a universal process of self-organized emergence of systems capable of processing symbolic information. They coin for it the name "ritualization transition" and discuss its status with respect to the kinetic phase transitions familiar from physics.
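Shannon's measure referred to above is simply H = -sum(p_i * log2 p_i) over the symbol frequencies. A minimal sketch of this generic textbook formula (not code from the volume) applied to a symbol sequence:

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy of a symbol sequence, in bits per symbol."""
    n = len(seq)
    counts = Counter(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

A uniform four-letter alphabet (e.g. A, C, G, T in equal proportions) gives the maximal 2 bits per symbol, while a constant sequence carries 0 bits.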

III. Biological information processing:

Undoubtedly, information processing, and the very concept of information for that matter, find their most exciting expressions in living matter. In the opening chapter of this Part, P. Schuster addresses the information processing mechanisms responsible for the build-up of an evolutionary memory within a population. The conditions under which optimality can be achieved are also analysed using computer simulations along with mathematical modelling, and connections to nonlinear dynamics and irreversible thermodynamics are suggested. Evolutionary arguments are also central in the chapter by Y. Almirantis et al, where the structure of the genome and, in particular, the distribution patterns of the distances between different groups of elements along it are explored and correlated with known evolutionary phenomena. The ubiquity of power-law behaviours is established, and a model based on aggregative dynamics capable of reproducing these patterns is proposed. Pattern formation on a much larger scale, associated with embryonic development, is considered in the closing chapter by S. Papageorgiou. A biophysical model is proposed to explain the appearance of a sequential pattern along the anterior-posterior axis of a vertebrate embryo, in coincidence with the 3' to 5' order of the genes in the chromosome.
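A power law y = c * x**a, such as the power-law distance distributions reported by Almirantis et al, appears as a straight line of slope a on a log-log plot, so the exponent can be read off by least squares. A minimal sketch on synthetic data (the data and the exponent -1.5 are assumptions for illustration, not the chapter's results):

```python
from math import log

def loglog_slope(xs, ys):
    """Least-squares slope of log(y) against log(x).

    For data following y = c * x**a, the slope recovers the exponent a.
    """
    lx = [log(x) for x in xs]
    ly = [log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((u - mx) * (v - my) for u, v in zip(lx, ly))
    den = sum((u - mx) ** 2 for u in lx)
    return num / den

# Synthetic inter-element distances (assumed for illustration):
xs = [1, 2, 4, 8, 16, 32]
ys = [x ** -1.5 for x in xs]
```

On real data the fitted slope would only approximate the exponent, and goodness-of-fit checks would be needed before claiming a power law.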

IV. Complexity, chaos and cognition:

This Part deals with the multiple facets of information processing by the brain, a question that was at the center of John Nicolis' interests throughout his career. Different approaches to cognition are developed and the status of self-referential processes is discussed. In the opening chapter, W. J. Freeman summarises the role of chaos in brain function from a "bottom-up" approach. He discusses the state of "criticality" of the cerebral cortex and its properties as a system at the edge of chaos, an idea that J. S. Nicolis employed in his studies of the mechanisms of cognition in the brain as a hierarchical system. Freeman offers a state-of-the-art discussion of brain waves, emerging patterns, fractality, quantum-field considerations and the role of noise in the emergence of coherent and intermittent states in brain dynamics. The spectral properties of brain recordings are studied by Provata et al in comparison with the spectral signatures of a Lorenz-type model in its turbulent regime, bringing out the relevance of chaotic dynamics to understanding the phenomenology of brain recordings. F. T. Arecchi addresses the issue of cognition and language with respect to brain dynamics in a hierarchical-system perspective via a "top-down" approach. He proposes a quantum-like model in which apprehension, judgment and self-consciousness can be discussed, and shows that the uncertainty in the information content of spike-train recordings is ruled by a "quantum" constant that can be given a numerical value depending on the specifics of the experimental setup. Subsequently, K. Kaneko presents a tantalizing approach to bridging the gap between dynamical systems and biological information processing. Chaotic itinerancy in high-dimensional dynamical systems, induced switches of states, and interference between slow and fast modes via "super-selection rules", also a preoccupation of J. S. Nicolis, are reviewed and applied to cell differentiation, adaptation, and memory. The necessity to expand the mathematical framework to include self-referential dynamics for such "super-selection rules" is also stressed. Closing this Part, I. Tsuda discusses self-reference and chaotic itinerancy in relation to the dynamics of cognition, perception ambiguity and paradoxical games from a purely dynamical-systems point of view.

V. Dynamical games and collective behaviours:

Continuing on the theme of games, C. Grebogi and coworkers address the outstanding and fundamental problem of species coexistence. They approach this problem by augmenting evolutionary games with mobility in the species dynamics, under cyclic competition, which enables them to elucidate the underlying, fundamentally nonlinear mechanisms. The emerging picture is one of a complex, non-trivial, chaotic landscape of coexistence and extinction. The emergent properties of the collective behaviour of animal groups are the theme that follows, where T. Bountis et al. study collective behaviours and phase transitions in models of bird flocking. With an emphasis on the interplay between the topological and dynamical constraints present in the complex interactions of the constituent parts, they discuss the emerging complex patterns of motion. In the same vein, but with another biological model of social animal behaviour, that of ants, S. C. Nicolis presents evidence of fractal scaling laws in the ubiquitous activity of animal construction. Fractal scaling laws have also been associated with the underlying process of self-organized criticality, a theme that John Nicolis enthusiastically and frequently discussed in his work and teaching. Y.-P. Gunji offers his view of an extended self-organised criticality in asynchronously tuned cellular automata. He provides a link with dynamical games based on cellular automata distributed in space, and demonstrates the subtleties in the information flow of synchronous versus asynchronous updating of their local states. He demonstrates that asynchronous information updating is the main formative cause of self-organized criticality.

Closing the volume, O. E. Rössler dialogues with John Nicolis and invites us on a journey on the theme of scientific revolutions crossing space-time barriers, from Heraclitus to Hubble.

The general approach followed and the ideas put forward in this volume will prove useful to students, researchers and the general public attracted by the interdisciplinary approach to science.

Quotes From the Book:

"Mapping the continuous description of a system into a discrete set of states also means that the original, fine grained dynamics induces a symbolic dynamics describing how the sequence of letters from an alphabet unfold in time. This provides a natural link with the information theory view of chaos pioneered by John Nicolis." Chapter 4, Coarse Graining Approach to Chaos, by Donal MacKernan.

"Our interest [...] is focused on the self-organization of information, on the way how a physical system can be enabled to create symbols and the related symbol-processing machinery out of ordinary pre-biological roots."

Chapter 9, Selforganization of Symbols and Information, by Werner Ebeling and Rainer Feistel.

"The late Professor J.S. Nicolis always emphasized [...] the relevance of a dynamical systems approach to biology. In particular, viewing the genome as a "biological text" captures the dynamical character of both the evolution and function of the organisms in the form of correlations indicating the presence of a long-range order. This genomic structure can be expressed in forms reminiscent of natural languages and several temporal and spatial traces left by the functioning of dynamical systems: Zipf laws, self-similarity and fractality".

Chapter 11, Long-Range Order and Fractality in the Structure and Organization of Eukaryotic Genomes, by Dimitris Polychronopoulos, Giannis Tsiagkas, Labrini Athanasopoulou, Diamantis Sellis and Yannis Almirantis.

"In the last decades a wholistic approach has emerged aiming at explaining the complex phenomena of life. In this direction different branches of Science like Chemistry, Physics, Mathematics have contributed. Systems Biology consists of this inter-disciplinary field where the development of powerful computational techniques plays a fundamental role. This new field aims at discovering emerging properties at the level of cells, tissues, organisms, populations functioning as a whole system."

Chapter 12, Towards Resolving the Enigma of HOX Gene Collinearity, by Spyros Papageorgiou

[A recent lucid but deep insight from an indisputable, double-degree biomathematician, Director of the $600 M "Mount Sinai Center of Genomics and Scale Free Biology", is here, and the late-2014 horse-race status is outlined in the second entry down here ("Cancer Genomics") - AJP]

Craig Venter on How the Genomic Era Is Just Starting

By Craig Venter December 04, 2014

2000 The human genome is decoded for the first time.

There have been lots of stories written about all the hype over getting the genome done and the letdown of not discovering lots of cures right after. The biggest finding when we announced the genome in 2000 was that we only had 20,000-some-odd genes. Some people thought that because we were humans, we had to have more than every other animal. We had to have 10 times more, 100 times more. Everybody wanted there to be this nice, linear path where you have a gene for each trait—you know, there’s a gene that codes for your nose. It’s so naive.

Only now are we really starting to appreciate all the changes that have occurred over the past 15 years. On the computing side and the sequencing side, both of those just passed a cost and performance threshold that will allow for a major impact on medicine. We’re just now where I wanted to be in 2000. In the last three years, we’ve seen some impact on medicine with the discovery of the driver mutations in cancer. If you have lung cancer, the most important thing you can know is your genetic code. Roughly 4 percent of people with lung cancer will have this ALK gene translocation that seems to indicate that Pfizer’s (PFE) drug has a better-than-60-percent chance of shrinking your tumor. Instead of a blockbuster drug that you can give to everybody that has cancer, you end up with a drug that you give to 4 percent of the people we know can benefit from it. That is a fundamental change in medicine.

There’s been a huge increase in the last three years of people measuring these driver mutations, but revolutions in medicine are slow. My understanding is only about 3 percent of cancer patients in the U.S. get genetic screening of their tumors. We have the tools to do it now, but the physicians don’t have the training and the know-how to help get this implemented to benefit their patients. The only way to come up with all of the cures that were promised is to sequence large numbers of genomes. For each gene in your genome, you quite often get a different version of that gene from your father and a different version from your mother. We need to study these relationships across a very large number of people. It’s going to be important to know what the variant is you got from your mother and from your father, and whether that correlates with 30 other variants across the genome that are associated with susceptibility for a certain type of cancer, for example.

That’s why we started my new company Human Longevity. We’re trying to sequence 40,000 genomes in the next six months and then scale up to 100,000 a year. We have a goal of getting to a million genomes by 2020. It’s around $1,000 to $1,500 to do a genome today, but we’re counting on continued Moore’s Law-type change to take it down to a few dollars per genome.

We’ve proven that DNA is our software, the software of cells. But genomics can’t really, truly impact medicine—get us to preventative medicine, get us to new treatments—until we can truly read that software. We have less than 1 percent of the information that people will have in the future. So to me, we’re just starting the genomic era now.—As told to Ashlee Vance

[I am a great admirer of Craig - while in my lectures I publicly call George Church "the Edison of Genomics", I call Craig Venter "the Tesla of Genomics" - you figure out the difference. As a software & IT specialist, however, I cannot fully agree with Venter that the genome is "our software" (elsewhere the genome is called yet another natural language). While we must congratulate Craig for entering the Silicon Valley horse-race of Google Genomics versus Human Longevity versus the Chinese-owned Complete Genomics, one must note that nobody can "read that software" who does not understand the (computer) code. Is the genome "written" in Python, C, Fortran, Basic? Obviously not. Do the "language-like features" of the genome make it a "natural language" like Arabic or English? Obviously not. The genome is a code and (contrary to too many claims) has never been "decoded" by merely "sequencing the DNA". You can lay out all the Cyrillic letters of "War and Peace" (in Russian), yet you cannot read it, let alone understand it, without understanding Russian. The genome/epigenome system encodes life by the mathematics of nonlinear dynamics. One complication is that there is a dual representation: the directly coding (but fractured) "genes", and the opposite valence of the measurement-type "non-coding" regulatory sequences. You bet on the mathematics. My thesis to beat is that the "fractal genome governs fractal growth of organisms" - and the utility is FractoGene - Dr. Pellionisz]

Cancer Genomics

Complexity is in the Eye of the Bewildered

The Dreaded DNA Data Deluge is Deduced by FractoGene to Diagnostic and Therapeutic Utility

Book cover of "Cancer Genomics" (on the left) depicts the oncoming "Zeitgeist": the recurring waves of cancerous growth invoke Mandelbrot's inspiration for coining "FRACTAL" (see on the right the insert from Mandelbrot's "The Fractal Geometry of Nature", page C16). Significantly, the book "Cancer Genomics" invokes the classic artwork of Hokusai, but the original inscription-box is replaced by a double helix, since cancer is universally known as a genome-misregulation syndrome (the original Japanese inscription reads: "Thirty-six views of Mount Fuji / offshore from Kanagawa / Beneath the wave"). The book "Cancer Genomics" does explicitly say that "advanced...analysis methods, such as fractal dimension calculations, can also be applied" (in contrast to Mandelbrot, who deliberately stayed away from the "mathematization (geometrization)" of biology; see his autobiography, "The Fractalist" - the index of his masterpiece "The Fractal Geometry of Nature" does not mention the word "cancer" at all). On the cover of "Cancer Genomics", the upper-right insert shows a faded-out, maddeningly complex set of "pathways" ("everything is connected to everything, thus blockage of certain pathways appears not to be a barrier to a Tsunami that just gets around them"). The FractoGene concept (Pellionisz, 2002) draws utility (2012) from relating the "fractal genome governing growth of fractal organisms". Claims and the "know-how of Best Methods" are held as trade secrets after the most recent CIP in 2007. The aim is to relate the fractal genome to the growth of cancers, to better predict, diagnose and treat by precision therapy, based on detecting fractal defects in the genome (prior to the development of cancerous protein structures). Investment & Licensing, as well as Advisership and Consultancy avenues, are now open; mail a "Letter of Interest" to Attorney Kevin Roe, Esq., contact info shown in the top part of this webpage.
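The "fractal dimension calculations" the book mentions can be illustrated with a minimal box-counting sketch. This is a generic textbook method on toy data, not the proprietary FractoGene approach; all names and figures here are illustrative assumptions:

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate the box-counting (fractal) dimension of a 2-D point set.

    Since N(eps) ~ eps^(-D), D is the slope of log N versus log(1/eps).
    """
    counts = []
    for eps in epsilons:
        # Assign each point to a grid box of side eps; count occupied boxes.
        boxes = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)),
                          np.log(counts), 1)
    return slope

# Sanity check: points on a straight segment should give dimension close to 1.
t = np.linspace(0.0, 1.0, 100_000)
line = np.column_stack([t, t])
print(box_counting_dimension(line, [0.1, 0.05, 0.02, 0.01]))  # close to 1.0
```

For a genuinely fractal set (a strange attractor, a coastline, or a suitably embedded genomic sequence) the same slope comes out non-integer.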

Ever since Leroy Hood declared "Genomics=Informatics" (2002), those who never believed in "the biggest mistake of the history of molecular biology" have prepared in droves for the sequencing AND analytics of full human genomes. This task was predicted a decade ago to cause an earthquake-like "Big One" - a major collision of the tectonic plates of Genomics and Big IT.

Well, it took an entire decade - but in the USA alone the top IT companies are now in a horse-race. Below is a (partial...) list of major contenders - Google Genomics, Amazon Web Services, and IBM - in alliance with genome data sources. (Not belabored here are further USA-based efforts, like Craig Venter competing in Mountain View with Google, or Dave Haussler at UCSC warehousing cancer data for NCBI. Foreign competitors - Sony, Panasonic, Samsung in Asia and Siemens, SAP in Europe - are treated separately. China, Russia, India etc. are also joining the fray - potentially altering the ge(n)opolitical equilibrium.)


Ambitious Google drive to put human genome online gathers steam

Published time: November 08, 2014

[see YouTube here]

Google’s plan to store entire copies of the human genome online is edging closer to reality. With 3,500 genomes already stored on its servers and more medical institutes jumping onboard, the blueprint of every person on Earth could soon be in the cloud.

The potentially game-changing project, called Google Genomics, has now been quietly moving forward for a year and a half.

Perhaps with so many ambitious plans coming out of the company’s secretive Google X research and development division, from nanobots to sniff out cancer to tremor-canceling spoons for Parkinson's patients, it’s easy for even deeply ambitious projects to get overlooked.

READ MORE: Google’s next data collection project: Human body

RT first wrote about the search giant’s plan to create individual genome databases in July, but even promotional videos hashing out the details of the project attracted just 5,000 views over the past four months.

Maybe the fact that Google Drive currently cannot cope with entire copies of the genome has left many thinking the project is a mere speculative pipe dream at this stage of the game.

READ MORE: Google nanobots: Early warning system for cancer, heart disease inside the body

But for futurists, whose first commandment is Moore's Law, that which is impossible today will likely be outdated tomorrow.

Google itself was quick to point out that at the inception of the Human Genome Project, it took 15 years and $3 billion just to do the first human genome sequence. Today, it can all be done in a day, and for about $1,000.

Just how many gigs am I?

But just how much memory is needed to save all 6 billion of the nucleotide letters that comprise a single genome sequence? Google estimates it at around 100 gigabytes, which might not seem like a lot, until you consider just how many of us there are.

For example, if you wanted to store the DNA of everyone (officially!) living in Moscow, it would take more than 1.2 million terabyte hard drives. While that is obviously an enormous amount of information to process, Google's current search index stands at 100 petabytes - 100,000 terabytes. The average search query, however, takes 0.25 seconds.
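The back-of-the-envelope arithmetic above is easy to check. A minimal sketch, assuming the article's rough figures (about 100 GB of raw data per genome, and an official Moscow population of about 12 million):

```python
# Rough storage arithmetic using the article's assumed figures.
GENOME_GB = 100          # ~100 GB of raw sequence data per genome
MOSCOW_POP = 12_000_000  # official Moscow population, ~12 million

total_tb = GENOME_GB * MOSCOW_POP / 1_000   # 1 TB = 1,000 GB (decimal units)
print(f"{total_tb:,.0f} TB")                # 1,200,000 TB: 1.2 million 1-TB drives
print(f"{total_tb / 1_000:,.0f} PB")        # 1,200 PB, vs. ~100 PB search index
```

That is, a single city's genomes would dwarf Google's entire 100-petabyte search index by an order of magnitude.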

And it is the application of this self-same search technology to Google Genomics which is viewed as the key.

At the inception of the project, scientists began hammering out an application programming interface (API) which would allow them to move DNA data into Google server clusters and conduct experiments using the company's renowned web-indexing technology.

And as scientists have expanded their studies beyond individual genomes, hammering out a synthesis between data science and life science could propel the pace of medical advancement over the coming years.

“We saw biologists moving from studying one genome at a time to studying millions,” David Glazer, the software engineer who led the effort and was previously head of platform engineering for Google+, the social network, told the MIT Technology Review. “The opportunity is how to apply breakthroughs in data technology to help with this transition.”

Currently, different genome data sets are exclusively available to specific research labs. The goal then, is to create one centralized database where researchers can compare millions of genome sequences at one time.

Speaking to Technology Review, Sheila Reynolds, a research scientist at the Institute for Systems Biology in Seattle, said one idea is to create "cancer genome clouds" where scientists can share information and quickly run virtual experiments as easily as a web search.

“Our bird’s eye view is that if I were to get lung cancer in the future, doctors are going to sequence my genome and my tumor’s genome, and then query them against a database of 50 million other genomes,” Deniz Kural, CEO of Seven Bridges, which stores genome data on behalf of 1,600 researchers in Amazon’s cloud, told the magazine. “The result will be ‘Hey, here’s the drug that will work best for you.’”

But as Reynolds noted, not every research institute has the ability to download a petabyte of data, or the computing power to analyze it.

With a centralized database, however, those technological trammels would be put out to pasture.

The treatment potential of being able to compare the genomes of multiple individuals suffering from the same ailments is astronomical, as is the profit motive for whoever holds the keys to the data locker.

This reality has already put Google, Amazon, Microsoft and IBM in a race to see who will store the data. And on a fair playing field, the competition has driven prices down.

Saving you for a quarter a year

Currently, storing a single human genome with Google is going to cost you $25 a year, in the same ballpark as Amazon. Running analysis of the data, of course, is gonna cost you. The catch, of course, is that people's DNA is 99.9 percent identical. Once you can whittle it down to the 0.1 percent that makes us who we are, less than a gig will be needed to store the essence of you in the cloud. So in the long term, a bit of analysis and a quarter will get your unique genomic sequence put up in the cloud for a year.
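The "less than a gig" claim follows from the 0.1 percent figure. A rough sketch, assuming one byte per stored variant base and ignoring positional metadata and compression details:

```python
# If only the ~0.1% of the genome that differs between individuals is stored,
# per-person storage shrinks dramatically (all figures are rough assumptions).
BASES = 6_000_000_000     # ~6 billion nucleotide letters per genome
VARIANT_FRACTION = 0.001  # ~0.1% differs between two individuals
BYTES_PER_BASE = 1        # generous: one byte per stored base

unique_bytes = BASES * VARIANT_FRACTION * BYTES_PER_BASE
print(f"{unique_bytes / 1e6:.0f} MB")  # ~6 MB: comfortably under a gigabyte
```

Even with generous overhead for recording where each variant sits, the per-person delta stays well under the gigabyte the article mentions.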

Glazer did not tell the magazine just how many customers Google Genomics has now, though at least 3,500 genomes from public projects are already stored on Google's server farm.

According to The Verge, the National Cancer Institute has already signed on to the project, and has expressed its willingness to pay $19 million to upload copies of its 2,600 terabyte Cancer Genome Atlas to Google Genomics and Amazon’s data center.

The project, however, definitely comes with its privacy pitfalls.

As Gizmodo recently noted, a study in the journal Science last year showed it was possible to identify several men from the publicly available 1000 Genomes Project based on their Y chromosomes and age, location, and family-tree data.

Insurance companies would also likely be thrilled to get their hands on that data.

There is also the issue of whether scientists should tell people if they unknowingly have a rare disease, or have unknown siblings out there in the world.

But while both concerns of privacy and practicality are inevitable in any venture of this scope, the cloud storage of the seemingly infinite permutations of AGCT which tell the story of every person on Earth seems all but inevitable.

[Can you guess why the above article on Google Genomics appeared on RT (the English-language broadcast of Russia)??? Hint: think of Google vs. Baidu; BGI of China - missing from this list - is also actively pursuing both sequencing and analytics - but not very publicly... - AJP]


Illumina on AWS - Customer Success Story

[see YouTube on Amazon and Illumina here]

AWS Case Study: Illumina

Biologists around the world use DNA sequencers created by California-based Illumina for a broad range of genomics applications including whole-genome sequencing. The company built its BaseSpace tool on AWS to allow researchers to upload massive data sets directly to the cloud for analysis and to store the results long-term with Amazon Glacier.

Baylor, DNAnexus, Amazon Web Services collaboration enables largest-ever cloud-based analysis of genomic data


Houston, TX - Oct 25, 2013

With their participation in the completion of the largest cloud-based analysis of genome sequence data, researchers from the Baylor College of Medicine Human Genome Sequencing Center are helping to usher genomic scientists and clinicians around the world into a new era of high-level data analysis. (A “cloud” is a virtual network of remote internet servers used to store, manage and process information.)

“The mission of the Baylor Human Genome Sequencing Center is to drive genomics and genomic analysis to be at the leading edge of everything in the field,” said Dr. Jeffrey Reid, assistant professor in the Human Genome Sequencing Center at BCM, who led the BCM portion of the project. “In terms of analysis, the future of genomic research and genomic medicine is in the cloud. We are very much going towards more computing and not less.”

Together with the Platform-as-a-Service company DNAnexus and Amazon Web Services, the largest provider of cloud computing, BCM sequenced the DNA of more than 14,000 individuals - 3,751 whole genomes and 10,771 whole exomes - using next-generation sequencing. (An exome contains all the genes in a genome and is the part of the genome that provides the blueprints for proteins.) The individuals whose genetic material was sequenced are part of the Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) consortium project, aimed at advancing understanding of human genetics and its contributions to heart disease and aging.

Reid gave a presentation on the project Oct. 25 at the American Society of Human Genetics annual meeting in Boston.

The BCM Human Genome Sequencing Center-developed Mercury pipeline, a semi-automated and modular set of tools for the analysis of next generation sequencing data in both research and clinical contexts, was an integral part of the project. The pipeline identifies mutations from genomic data, setting the stage for determining the significance of these mutations as a cause of serious disease.

Led by Dr. Eric Boerwinkle, professor and director of the Human Genetics Center at The University of Texas Health Science Center at Houston and associate director of the Human Genome Sequencing Center at BCM, the CHARGE project involves more than 300 researchers across five institutions around the world. The cloud-based analysis makes it possible for the large group to have access to an expansive network of data over a server that is HIPAA certified to not compromise patient privacy.

“The collaboration between the CHARGE consortium and the Human Genome Sequencing Center is leading to discovery of those genes contributing to risk of the most important diseases plaguing the U.S. population across all age groups,” said Boerwinkle. “Ultimately, these discoveries forge a path toward novel therapeutics and diagnostics. The use of cloud computing and collaboration with DNAnexus is allowing us to achieve our goals faster and in a more cost-effective manner.”(Boerwinkle will give an updated presentation November 15 at the Cold Spring Harbor Laboratory’s Personal Genomes & Pharmacogenomics Meeting.)

“Having access to this much data was unique,” said Reid. “Many institutions do not have the local compute resources and infrastructure to support large scale analysis projects like this one, so we were lucky to come together with DNAnexus and Amazon Web Services to make this project possible.”

The project required approximately 2.4 million core-hours of computational time, generating 440 TB (terabytes) of results and consuming nearly a petabyte of storage over a four-week period.
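To put the quoted scale in perspective: dividing the core-hours by the wall-clock window gives the sustained core count. This is a derived sanity check, not a figure from the press release:

```python
# Derived from the press-release figures: 2.4M core-hours over four weeks.
CORE_HOURS = 2_400_000
WEEKS = 4

wall_hours = WEEKS * 7 * 24        # 672 hours of wall-clock time
avg_cores = CORE_HOURS / wall_hours
print(f"~{avg_cores:,.0f} cores")  # ~3,571 cores running continuously
```

In other words, the analysis was equivalent to keeping roughly three and a half thousand CPU cores busy around the clock for a month.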

By comparison, the 1000 Genomes Project sequenced 2,535 exomes and required 25 TB of data.

“It is very important for us to create a centralized space where researchers from all over the world can come and collaborate with the data,” said Reid. “This project creates expansive access to this data over a protected network that will advance research.”


IBM’s Watson takes on brain cancer

[See YouTube on IBM & NYC Genome Center here]

Analyzing genomes to accelerate and help clinicians personalize treatments

New York Genome Center (NYGC) and IBM are collaborating to analyze genetic data to accelerate the race to personalized, life-saving treatment for brain cancer patients.

IBM's Watson cognitive computing system will be designed to analyze the genomic data from a small group of patients diagnosed with glioblastoma, one of the most aggressive and malignant brain cancers. Also the most common type of brain cancer in adults, glioblastoma kills more than 13,000 Americans each year.

IBM and NYGC's computational biology experts are renowned for accelerating life sciences discoveries using deep analytical approaches and next generation information technologies.

The system, expected to be deployed as a cloud-based prototype, will combine modern genomic analytics and comprehensive databases of biomedical literature with Watson's cognitive computing power to help clinicians uncover individual genetic patterns of glioblastoma. IBM will take advantage of NYGC's genomic and clinical expertise to continue to develop and refine the Watson system, with the shared goal of transforming care for all types of cancer based on the genetic characteristics of each person's cancer.

With a decade of research and development behind it, the Watson prototype is IBM's first solution specifically targeted at interpreting genomic data. By analyzing gene-sequence variations between normal and cancerous biopsies of brain tumors, Watson will then be used to review medical literature and clinical records to help clinicians consider a variety of treatment options tailored to an individual's specific type and personal instance of the cancer.

Normally, a diagnosis of glioblastoma presents a prognosis of about a year to live, depending on the stage and spread of the cancer at the time. This includes the difficult period of interpreting the best treatment based on the knowledge and information at hand. The Watson system is designed to complement rapid genome sequencing and to help dramatically reduce the time from gathering the genomic data of an individual's tumor variant to clinical interpretation, enabling clinicians to more rapidly make decisions on how they can treat their patients.

Oncologists could use the cloud-delivered system in real-time to analyze genetic data with the intelligent machine curation of comprehensive biomedical literature and drug databases. This analysis can help pinpoint potential therapeutic options that are specific to a patient’s cancer genome, to aid oncologists in their treatment and care decisions.

As these types of intelligent systems become commercially available in medicine, it is expected that many more patients will have access to treatments that are increasingly more tailored to their disease's DNA. The system can continually "learn" as it encounters new patient scenarios and as more information becomes available through new medical research, journal articles and clinical studies.

"As genomic research progresses and information becomes more available, we aim to make the process of analysis much more practical and accessible through cloud-based, cognitive systems like Watson," said Dr. John E. Kelly, Senior Vice President and Director of IBM Research. "With this knowledge, doctors will be able to attack cancer and other devastating diseases with treatments that are tailored to the patient’s and disease’s own DNA profiles. This is a major transformation that can help improve the lives of millions of patients around the world.”

Exponential Medicine by Singularity University - San Diego, 2014

Ray Kurzweil's "Singularity University" kicks off on Sunday (Nov. 9) a landmark meeting in San Diego - now the headquarters (not hometown) of Craig Venter.

For some, it may be difficult to make sense of the title - as both "exponential" and "singularity" are clearly mathematical terms, seemingly with not much in common. What is the meeting all about?

To simplify, I would call it in spades the "Computerization of Medicine". That instantly explains why key figures like Ray Kurzweil (now a Googler...) and Craig Venter are there. (Venter, now close to his Bay Area "hometown", launched a bold attempt in Mountain View to compete head-on with Google, led by the ex-Googler Franz Och and based on the 20-year-old notion that the genome is a bit like a natural language.) Further, it may also explain why Vinod Khosla, with a subcontinent of IT outsourcing behind him, enters this most interesting new turmoil. Peter Diamandis talks about "exponential thinking" (in plain English: disruption, and the need for bold thinking) - another tell-tale sign.

Kurzweil is no stranger to bold thinking & fractals. His famed book "The Singularity is Near" (2005) shows some seven entries in the index; look up "fractals". Venter is famous, among other things, for his answer to "What is the difference between you and God?" - "We had computers!"

Enter Peter Diamandis - the only person among the major movers & shakers mentioned above who actually holds the M.D. title of a physician (interestingly, he put his M.D. program on hold while at MIT to pursue a master's degree in aeronautics and astronautics, and returned to Harvard to complete his M.D.).

Medicine, for ages, had very little to do with mathematics, let alone computers, which did not exist until very recent history. The disruption is so dramatic that one may even want to question why it is inevitable.

The two keywords are "genomics" and "cancer". Today, professional mathematicians (Dave Haussler, Eric Schadt, Eric Lander etc.) totally agree that the "digital information of the genome" and the "genome disease" (a.k.a. cancer) are (separately, and especially together) simply beyond the comprehension of the un-aided human mind. The "aid" is clearly mathematics (software-enabling algorithms), implemented by the abundance of computing power.

Breakthrough will not be easy - that is why it is so lucrative. "Medicine" has traditionally been the "human art" of helping people with ill-understood conditions. Indeed, the very notion (on the cover of Newsweek lately) that "You Can Not Cure a Disease That You Don't Understand" is enigmatic for those medical doctors who openly confess that most diseases & their medications are "poorly understood".

Steve Jobs hoped (too late for him, too early for science) that cancer would go down in history as the first disease resolved by the power of mathematics and computers.

Most of those presently alive will see it happen. Surprising? Hardly. Centuries ago, most people died of infections that can now be cured by popping antibiotics. I remember being terrified as a young boy by swimming pools on hot summer days - for fear of polio. In the history of medicine, along came Fleming, Sabin, Salk (etc.) and the world changed.

These days, we live similar times of disruption.

Look up the program of the conference

Join the conference by live stream


TIME Magazine 2003 Summit versus 2014 Special Issue

Time 2003 Monterey 50th Anniversary Summit: Jim Watson, honored for his discovery of the structure of DNA

Time 2014 Special Issue: A decade later, the issue is not the known structure, but the mathematical function. Sorry, 1% "genes" and 99% "junk" will not do it anymore. "The more we learn about the genome, the more we learn how complicated it is" (p. 12). "Will we ever finally - actually - decode it? Maybe it doesn't matter... You just have to know how to read the map" (p. 13). Although FractoGene was introduced at the 2003 Monterey meeting (in the "gene euphoria", noted by a select few), just as "the splitting of the atom" required Quantum Physics, the recent Newsweek cover ("You can not cure a disease that you don't understand") will require the advanced mathematics of the non-linear dynamics of fractal recursion. A much earlier but equally massive change was required to go from Alchemy (trying, in vain and without understanding, to turn lead into gold...) to the new science of physics-based Chemistry.

Ovarian cancer oncogene found in 'junk DNA'

A research team mined junk DNA sequences to identify a non-protein-coding RNA whose expression is linked to ovarian cancer.

Most genetic studies have focused on the portion of the human genome that encodes protein, which is a fraction that accounts for just 2% of human DNA overall. Yet the vast majority of genomic alterations associated with cancer lie outside protein-coding genes, in what traditionally has been derided as junk DNA. Researchers today know that junk DNA is anything but junk, as much of it is transcribed into RNA, for example, but finding meaning in those sequences remains a challenge.

Supported by the Basser Research Center for BRCA at the Abramson Cancer Center at the Perelman School of Medicine at the University of Pennsylvania in Philadelphia, a research team led by Lin Zhang, PhD, built a DNA copy number profile for nearly 14,000 long noncoding RNAs, or lncRNAs. They did this across 12 cancer types, including ovarian and breast cancers that include the two major BRCA-related cancers. They found that the number of copies of lncRNA genes on a chromosome consistently changes in 12 different cancer types. Also, lncRNA genes are widely expressed in cancer cells.

Using clinical, genetic, and gene expression data as filters to distinguish genes whose copy number alteration causes cancer from those for which copy number changes are incidental, the team whittled down their list from 14,000 to a more manageable number, each of which they systematically tested using genetic experiments in animals. Their study was published in Cancer Cell (2014; doi:10.1016/j.ccr.2014.07.009).
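The filtering logic the team describes (keep candidates whose copy-number change is both frequent and tracks expression) can be sketched in a few lines. The records and thresholds below are invented for illustration; they are not the study's actual data or cutoffs:

```python
# Toy filter: a driver-like lncRNA should be frequently amplified AND its
# copy number should correlate with its expression; incidental amplifications
# fail one of the two tests. All numbers here are made up.

candidates = [  # (lncRNA, copy-number-gain frequency, corr(copy number, expression))
    ("FAL1",  0.42, 0.71),
    ("lncX1", 0.05, 0.10),  # hypothetical: rarely amplified
    ("lncX2", 0.38, 0.04),  # hypothetical: amplified, but expression unchanged
]

def driver_like(rows, min_gain=0.2, min_corr=0.5):
    """Keep candidates passing both the frequency and the correlation filter."""
    return [name for name, gain, corr in rows if gain >= min_gain and corr >= min_corr]

print(driver_like(candidates))  # ['FAL1']
```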

Of the 37 lncRNAs the team fully tested, one known as focally amplified lncRNA on chromosome 1 (FAL1) had all the makings of an RNA oncogene. FAL1 is one of only a handful of lncRNAs linked to cancer to date. This knowledge is now being translated into clinical applications.

For example, FAL1 expression may be a biomarker of BRCA-related cancer prognosis and the basis of new anticancer therapeutics. As proof-of-principle of the potential efficacy, Zhang's team grew human ovarian tumors in immunocompromised mice, then injected short-interfering RNAs to block the tumors' growth using RNA interference against FAL1. The tumors in treated animals shrank over the course of the experiment, while tumors in control animals continued to grow.

FAL1 is overexpressed in ovarian and breast cancer samples. Blocking the activity of the gene via RNA interference reduces cancer cells' growth, while overexpressing it in normal cells increases their growth. High FAL1 expression in human ovarian cancer samples tended to correlate with poor clinical prognosis.

"This is the first genome-wide study to use bioinformatics and clinical information to systematically identify one lncRNA, which we found to be oncogenic," Zhang said.

FAL1 expression may be able to serve as a biomarker of BRCA-related cancer prognosis, assuming these findings can be validated in other populations. But there also is the potential for new anticancer therapeutics, Zhang said, whether those are therapeutics specifically targeting FAL1 RNA or small molecules that block the interaction between FAL1 and BMI1.

[Only a handful of "Old Schooler" detractors cling to the disarmingly naive and totally non-mathematical (thus, for software, useless) notion of Crick (1956) & Ohno (1972) that genome function can be explained by less than 1% of the human DNA (the rest tossed aside as "Junk DNA"), and that in the DNA>RNA>PROTEIN expression there is "never" sequence-information recurring from PROTEIN>DNA. The "Genes & Junk" Old School has been dead as a doornail since the first ENCODE (2007). The Principle of Recursive Genome Function (2008) put genome regulation on a mathematical basis (fractal recursive iteration), in which "fractal defects" in the DNA derail recursion. Thus FractoGene (fractal DNA grows fractal organisms) opens up a new mathematical science, with the immediate protected utility of detecting "fractal defects" popping up in the DNA, yielding much earlier (and mathematically precise) diagnosis of the onset of cancerous growth (appearing as tumors later). Also, by matching the full genome (focusing on the 99% of "non-coding" DNA) with genome-tested available therapies, a mathematical matching yields "precision therapy". This vastly improves on the 80%-non-effective "trial and error" practice of the many decades since Nixon's "War on Cancer" - who knows how many hundreds of billions of dollars mis-spent, yet leaving who knows how many hundreds of millions of patients to one of the most dreadful and expensive "slow tortures" - andras_at_pellionisz_dot_com]
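As a purely illustrative sketch of what a "self-similarity scan" of a DNA sequence might look like, here is a textbook toy (a purine/pyrimidine walk and a crude scaling exponent), not the protected FractoGene method:

```python
import math
import random

# Toy sketch: estimate a scaling (Hurst-like) exponent of a DNA "walk".
# Illustrative only; NOT the FractoGene method of detecting fractal defects.

def dna_walk(seq):
    """Map a sequence to a cumulative purine(+1) / pyrimidine(-1) walk."""
    y, walk = 0, []
    for base in seq:
        y += 1 if base in "AG" else -1
        walk.append(y)
    return walk

def fluctuation_exponent(walk, scales=(4, 8, 16, 32)):
    """Log-log slope of mean |increment| versus scale: about 0.5 for an
    uncorrelated walk; deviations hint at long-range structure."""
    pts = []
    for s in scales:
        incs = [abs(walk[i + s] - walk[i]) for i in range(0, len(walk) - s, s)]
        pts.append((math.log(s), math.log(sum(incs) / len(incs))))
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(4096))
h = fluctuation_exponent(dna_walk(seq))
print(round(h, 2))  # close to 0.5 for a random (structure-free) sequence
```

A sequence with genuine long-range correlations would push the exponent away from 0.5, which is the kind of signal a fractal analysis looks for.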

Priority Health Becomes First Health Plan to Cover Foundation Medicine's Tests

October 16, 2014

By a GenomeWeb staff reporter

NEW YORK (GenomeWeb) – Priority Health has begun coverage of Foundation Medicine's genomic profiling services for cancer, making the health plan the first in the country to provide such coverage, the companies said after the close of the market on Thursday.

The positive coverage decision is for FoundationOne and FoundationOne Heme. FoundationOne interrogates the entire coding region in 315 genes and select introns in 28 genes that are commonly altered in solid tumors. FoundationOne Heme analyzes DNA in 405 genes and RNA in 265 genes that are often altered in blood-based malignancies, sarcomas, and select pediatric cancers. Both services provide information that enables clinicians to more appropriately select therapies for their patients.

During the summer, the New York State Department of Health approved Foundation Medicine to market the tests to residents of the state.

Priority Health Associate Vice President of Medical Affairs John Fox said in a statement that cancers have been historically categorized and treated based on where they are located in the body. "With additional information available about the underlying genomic drivers of a tumor's growth, we're helping our members to access customized treatment options with targeted therapies," he said.

The health plan is based in Grand Rapids, Mich., and has more than 900,000 healthcare providers in its network. According to Priority Health's website, it has 600,000 members.

Explanation for tests:

FoundationOne is a targeted genomic sequencing test for solid tumors that profiles the coding regions of 236 cancer-related genes, as well as 47 introns from 19 other genes for alterations associated with existing and experimental molecularly targeted therapies.

FoundationOne Heme specifically targets hematologic cancers such as leukemia, lymphoma, and myeloma, as well as sarcomas and pediatric cancers to guide treatment options for patients based on the genomic profiles of their cancer. The test involves sequencing of 405 full genes, select introns of another 31 genes, and RNA sequencing targeting 265 genes that are validated molecular targets for therapy or unambiguous drivers of oncogenesis in these cancers based on current knowledge.

[The Business Model of Genome Informatics has just been completed by a definitive answer to "where is the money?" Even without the USA insurance system, genome-matched cancer therapy is viable (for the rich in the USA) - or for civilized countries where health care is a government-supported system with built-in incentives to pay only for drugs that actually work. For what used to be "sick care" in the US, where "repeat customers" of cancer patients are "welcome to use as many chemos as possible before they die", the announcement that US insurance system(s) will become interested in cost-efficacy is huge news. It enlarges the "reimbursed patient base" in a rapidly escalating manner - and at the same time companies such as Foundation Medicine that leverage MANY pharma companies inevitably trigger a competition among them: which pharma company will excel over the competition by genome-testing its products? Foundation Medicine deserves huge credit for making this happen. At the same time, please note that FMI probes only "the coding regions of 236 genes" (though, as a Founding Advisor noted, "cancer-related" genes could potentially be as many as all (19,000) human genes). Also note that presently, for lack of understanding of fractal recursive genome function (which, when misregulated, results in cancer), for the 98.7% of the human genome that is "non-coding", the leading edge of FMI currently probes only 19 genes (0.1% of all genes) for their "non-coding" (intronic) segments. The incredibly good news is that FMI is already on its way to probing "non-coding DNA" for cancer at all. The growth potential is obviously phenomenal, and is phenomenally lucrative. - andras_at_pellionisz_dot_com]
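The coverage fractions quoted in the note above follow from simple arithmetic, assuming the roughly 19,000 protein-coding human genes the note itself uses:

```python
# Back-of-envelope check of the coverage figures cited above,
# assuming ~19,000 protein-coding human genes (the figure used in the note).
TOTAL_GENES = 19_000

coding_panel = 236    # genes whose coding regions FoundationOne profiles
intronic_panel = 19   # genes probed for intronic ("non-coding") segments

print(f"{coding_panel / TOTAL_GENES:.1%} of genes probed in coding regions")
print(f"{intronic_panel / TOTAL_GENES:.1%} of genes probed in intronic segments")
# 1.2% and 0.1%, respectively
```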

ASHG: Data Science to Help Genomics Move from 'Artisanal' to 'Factory' (Google? IBM? Apple? GE Health? Microsoft? ... or Fast and Furious?)

October 20, 2014

By Ciara Curtin

SAN DIEGO (GenomeWeb) – While genomics has shown promise at a small scale for matching patients to treatments, scaling that capability up so that personalized medicine may be realized for all will require a lot more data to be sifted through, speakers at this year's American Society of Human Genetics meeting said.

"How can we apply those breakthroughs in data technology to help with the transition from a world of one to a world of millions?" asked Google's David Glazer at ASHG, referring to the increasing number of people who have undergone genome sequencing.

Genomics is generating a storm of data, not just in terms of sequencing reads coming off of newer and faster machines, but also in terms of sheer research output as more journal articles showing links between variants and disease are published. At the same time, groups are working on building data standards to facilitate the sharing of clinical and genomic data.

IBM's Ajay Royyuru also noted at ASHG that between 6,000 and 10,000 articles are published a year that mention cancer — an amount that people just can't realistically read, even though researchers and clinicians need to keep up to date to find the best treatment for patients.

"This is a problem that really deserves help," Royyuru said.

The key factors of such a process, Royyuru said, are that it has to be comprehensive and objective as well as scalable and fast. Additionally, he said, it has to be transparent and show the reasoning that led it to its conclusions.

He and his colleagues at IBM are turning to the supercomputer Watson to digest those papers and assess how their findings may relate to patients.

Through the Precision Oncology workflow he and his colleagues developed, patient sequencing data is fed into Watson, which then compares it to what's housed in databases like PubMed, the National Cancer Institute's Pathway Interaction Database, and DrugBank, among others. From this, Watson develops a conceptual model of the disease and outputs a set of treatment options. It also provides the reasoning behind choosing those possible therapies that may then be presented to a tumor board, for example.

The process of generating a report takes five to 10 minutes, he said.
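The pipeline described above (patient variants in, knowledge-base lookup, reasoned options out) can be caricatured in a few lines. The gene-drug table and its annotations below are invented for illustration; they are not IBM Watson's knowledge base or its output format:

```python
# Caricature of a variant-to-therapy matching step. The associations below
# are illustrative placeholders, not a real clinical knowledge base.

KNOWLEDGE = {
    "BRAF V600E": ("vemurafenib", "approved targeted agent in melanoma"),
    "EGFR L858R": ("erlotinib", "approved targeted agent in NSCLC"),
    "KRAS G12D":  (None, "no approved targeted agent"),
}

def suggest_therapies(patient_variants):
    """Return (variant, drug, reasoning) rows, so the reasoning behind each
    suggestion is visible, the transparency Royyuru stresses."""
    report = []
    for v in patient_variants:
        drug, evidence = KNOWLEDGE.get(v, (None, "variant not in knowledge base"))
        report.append((v, drug, evidence))
    return report

rows = suggest_therapies(["BRAF V600E", "KRAS G12D"])
for variant, drug, why in rows:
    print(variant, "->", drug, "|", why)
```

The feedback loop the article mentions (treatment outcomes fed back in) would amount to updating the knowledge table as evidence accumulates.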

Additionally, Royyuru said, Watson would learn from the process as data regarding the treatment given to a patient and that patient's response are fed back in.

Currently this pipeline is a prototype that IBM is working on in conjunction with the New York Genome Center, and Royyuru added that IBM plans on recruiting additional beta testers next year.

In addition to Watson, other data technologies and expertise from the computer science field could be refashioned to analyze genomic data.

Companies like Google have experience working with large amounts of data. For instance, Glazer noted that 100 hours of video are uploaded to YouTube every minute and that the number of Gmail users is 150 times the number of US PhDs.

He and his colleagues also have begun to test their tools — like Dremel and BigQuery — on genomic data from the 1,000 Genomes Project. The first step of a principal component analysis of 1,000 Genomes Project data is to make a similarity matrix, and that takes, he said, about two hours on 60 eight-core machines.
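Glazer's two-step description (build a sample-by-sample similarity matrix, then extract principal components) can be sketched at toy scale. This assumes NumPy is available and uses made-up genotype data; real pipelines run the first step at cluster scale:

```python
# Two-step PCA sketch on a toy genotype matrix (samples x variants, values 0/1/2).
import numpy as np

rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(6, 50)).astype(float)  # 6 samples, 50 variants

# Step 1: the similarity (relatedness) matrix -- the expensive, parallelizable part.
Gc = G - G.mean(axis=0)        # center each variant across samples
K = Gc @ Gc.T / G.shape[1]     # samples x samples similarity

# Step 2: principal components are the leading eigenvectors of K.
vals, vecs = np.linalg.eigh(K)  # eigh returns eigenvalues in ascending order
pcs = vecs[:, ::-1][:, :2]      # two leading PCs, one row per sample
print(pcs.shape)  # (6, 2)
```

The point of the split is that Step 1 dominates the cost and parallelizes trivially across variant blocks, which is why it maps well onto infrastructure like Dremel/BigQuery.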

Being able to quickly build research questions one after the other is part of what's needed for innovation, Glazer added.

Still, he noted that to move genomics and personalized medicine from its current "artisanal" status to "factory" mode, there need to be better standards. The Global Alliance for Genomics and Health, of which Google and organizations like BGI-Shenzhen, Genome Canada, the US National Institutes of Health, and the Wellcome Trust are a part, is working on developing such standards to improve interoperability and enable data sharing. The group also is working with the Genome in a Bottle Consortium to develop benchmarking references.

Glazer believes these efforts will lead to more innovation to analyze and explore data.

[Glazer was publicly told, even if he did not know it before, that an innovation to analyze Recursive Genome Function was filed on August 1, 2002, and is now in force until mid-March 2026 as patent 8,280,641 - AJP]

Mount Sinai Opens New Genomics Lab (in Branford, Connecticut) with Bank of Ion Torrent Sequencer

Bio-IT World (Sept 11)

By Aaron Krol

September 11, 2014 | The old Roche 454 facility in Branford, Connecticut, will soon be returning to genomic science, as the Icahn Institute at Mount Sinai prepares to set up its second sequencing facility on grounds abandoned by Roche in its shutdown of 454 Life Sciences. Mount Sinai announced today that it has leased the space and purchased an initial bank of eight Ion Proton high throughput sequencers. “These instruments are getting installed, and we’ll be up and running next month,” says Glenn Farrell, Director of Mount Sinai’s Department of Genetics and Genomic Sciences.

Mount Sinai already operates one of the world’s leading hospital-affiliated sequencing centers at the Icahn School of Medicine in New York City. Under the leadership of systems biologist Eric Schadt, the Icahn Institute for Genomics and Multiscale Biology has become known both for tackling research studies into human disease on a grand scale, and for its early adoption of genomic testing as a tool in patient care. An Ion Proton cancer hotspot panel used at the Institute was recently approved by the New York State Department of Health for clinical use in refining cancer treatments, and Mount Sinai pursues more ambitious clinical projects under its research umbrella.

The New York City lab where Dr. Schadt works is now space-limited, Farrell told Bio-IT World, and Mount Sinai is establishing a second genetic testing location that can be more easily scaled up as the hospital system’s sequencing capabilities continue to grow. Branford is also home to a number of genetic specialists, making it an attractive setting for the new center. “With the 454 employees, and the adjacency to Yale University, you’ve got a lot of people with this type of skill set to hire,” says Farrell, adding that Mount Sinai is hiring between 20 and 25 scientists to staff the facility.

Among those hires is Todd Arnold, previously the VP for Research & Development at 454, and now the Managing Director of the new genetic testing lab. His team will be immediately joining a major research project already underway at the Icahn Institute: the Resilience Project, which aims to screen hundreds of thousands of healthy individuals for mutations that are predicted to cause devastating genetic disorders, with the goal of understanding factors that protect against these diseases.

“At the Icahn Institute, we’re known for big data, and we want to look at massive amounts of data points and separate the signal from the noise,” says Farrell. “We’re going to see projects, like the Resilience Project, that need massive high-throughput sample handling and analysis done at this facility.” The lab will also be involved in clinical diagnostics for the Mount Sinai hospital system.

Support for Life Tech

The Icahn Institute has designed a custom Ion Proton AmpliSeq Panel to use in these large-scale projects, covering 26,000 amplicons across more than 700 genes. As Eric Schadt explained to Bio-IT World by email, this panel, the largest ever designed for the platform, is intended to have as broad a clinical reach as possible. The genetic loci included in the panel cover all known variants linked to rare Mendelian disorders, as well as numerous gene-drug associations. “We then complemented this data by covering all regions harboring variants that are associated with common human diseases across the entire disease spectrum,” Schadt added, “so for example those loci that have been identified and highly replicated in GWAS [genome wide association studies] and WES/WGS [whole exome sequencing/whole genome sequencing] studies.” The resulting panel should be able to meaningfully contribute not only to the Resilience Project, but also to research on complex chronic diseases like cancer, diabetes, heart disease, and neurological disorders. Schadt also writes that the panel will be able to pick up both single-nucleotide variants, and small indels, which have often been a challenge for short-read technologies like the Proton.

[Branford, Connecticut lies on the outskirts of Yale University, where the late Prof. Mandelbrot held a Sterling Professorship. It is a delight to see Eric Schadt, who approves of the fractal approach, "coming home" in more than one sense]

NCI, NVIDIA Providing $2M for Omics-based Cancer Data Research

August 28, 2014

By a GenomeWeb staff reporter

NEW YORK (GenomeWeb) – The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) and the NVIDIA Foundation on Wednesday announced that they will provide up to $2 million in funding for the development of omics-based, data-intensive scientific tools to treat cancer.

CPTAC said it anticipates funding up to three awards totaling $1.8 million, while NVIDIA will make one award in the amount of $200,000 for projects aimed at creating tools that will enable the mining and interpretation of large-scale publicly available omics datasets. CPTAC and NVIDIA noted three research areas of interest, including the development of computational tools that use datasets to elucidate cancer biology and to advance the diagnosis, treatment, or prevention of cancer.

The organizations also are seeking projects to integrate new computational omics analysis methods into existing genomic data analysis pipelines to advance cancer biology knowledge and enable researchers to leverage omics advancements in cancer diagnosis and treatment, CPTAC and NVIDIA said.

Also of interest are efforts to develop new multi-omic simulation and/or visualization techniques that make computational biology accessible to researchers who have no programming experience.

The projects will use datasets from The Cancer Genome Atlas and CPTAC, as well as other publicly available omics datasets. Proposals will be assessed based on whether they advance new approaches or apply new insights "through parallel and/or visual computing in the area of computational omics;" whether the project will be used by or benefit multiple researchers and motivate new innovations; and whether the work leverages computational methods to solve a specific problem in computational omics, resulting in a "substantial impact" in cancer research.

Other criteria for funding include whether a project defines clear goals and a milestone-based plan for completion within a two-year time frame, and whether the work identifies obstacles and reasonable approaches to overcome them.

Applications are due Oct. 8.

[Applicants may want to secure the use of FractoGene IP (Patent 8,280,641 and Improvements of Best Methods post-2007, held as Trade Secrets). Should NVIDIA wish to secure that FractoGene IP excludes any "non-GPU" appliance-embedding, it needs to match or overbid the solely FPGA-based exclusive offer. All contacts through Kevin Roe, Esq., FractoGene Legal Department, 155 E Campbell Avenue, Campbell, CA 95008]

Fractal Genome yields 389,000 hits (392,000 a week later) - here is a list of over a thousand FRACTAL items just in PubMed.

My patent 8,280,641 has an original priority date of Aug. 1, 2002. Upon submission, since the utility of "fractal genome grows fractal organisms" ran head-on against BOTH of the then-ruling twin axioms ("Junk DNA" and the "Central Dogma" of the then-living Nobelist Francis Crick), it was blatantly dismissed - just like other paradigm-shifts. For instance, "Jumping Genes" won a Nobel after 40 years of marginalization and ridicule. Never mind opinions. Ask "what is in it for me?" and "what is the deal?" (now that I finally have the patent in force...)

Since my FractoGene could not possibly be published, I filed the utility as a patent. The USPTO (though both ENCODE-I in 2007 and ENCODE-II in 2012 shook the world with the conclusion that "the community of scientists must re-think longtime beliefs") took a leisurely time (over a decade...) to actually issue my patent. It is now in force till late March, 2026.

Today, Fractal Genome yields 389,000 hits on Google. Just in the top-rated PubMed, after my priority date, you will find FRACTAL in 1,161 instances:

1. PLoS One. 2014 Aug 8;9(8):e104682. doi: 10.1371/journal.pone.0104682. eCollection 2014.

Exhaled aerosol pattern discloses lung structural abnormality: a sensitivity study using computational modeling and FRACTAL analysis.

Xi J(1), Si XA(2), Kim J(1), Mckee E(3), Lin EB(4).

Author information:
(1)School of Engineering and Technology, Central Michigan University, Mount Pleasant, Michigan, United States of America.
(2)Science Division, Calvin College, Grand Rapids, Michigan, United States of America.
(3)College of Medicine, Central Michigan University, Mount Pleasant, Michigan, United States of America.
(4)Department of Mathematics, Central Michigan University, Mount Pleasant, Michigan, United States of America.

BACKGROUND: Exhaled aerosol patterns, also called aerosol fingerprints, provide clues to the health of the lung and can be used to detect disease-modified airway structures. The key is how to decode the exhaled aerosol fingerprints and retrieve the lung structural information for a non-invasive identification of respiratory diseases.

OBJECTIVE AND METHODS: In this study, a CFD-FRACTAL analysis method was developed to quantify exhaled aerosol fingerprints and applied it to one benign and three malign conditions: a tracheal carina tumor, a bronchial tumor, and asthma. Respirations of tracer aerosols of 1 µm at a flow rate of 30 L/min were simulated, with exhaled distributions recorded at the mouth. Large eddy simulations and a Lagrangian tracking approach were used to simulate respiratory airflows and aerosol dynamics. Aerosol morphometric measures such as concentration disparity, spatial distributions, and FRACTAL analysis were applied to distinguish various exhaled aerosol patterns.

FINDINGS: Utilizing physiology-based modeling, we demonstrated substantial differences in exhaled aerosol distributions among normal and pathological airways, which were suggestive of the disease location and extent. With FRACTAL analysis, we also demonstrated that exhaled aerosol patterns exhibited FRACTAL behavior in both the entire image and selected regions of interest. Each exhaled aerosol fingerprint exhibited distinct pattern parameters such as spatial probability, FRACTAL dimension, lacunarity, and multiFRACTAL spectrum. Furthermore, a correlation of the diseased location and exhaled aerosol spatial distribution was established for asthma.

CONCLUSION: Aerosol-fingerprint-based breath tests disclose clues about the site and severity of lung diseases and appear to be sensitive enough to be a practical tool for diagnosis and prognosis of respiratory diseases with structural abnormalities.

PMCID: PMC4126729
PMID: 25105680 [PubMed - in process]

2. Phys Rev Lett. 2014 Jul 25;113(4):046806. Epub 2014 Jul 25.

Anderson localization on the Bethe lattice: nonergodicity of extended states.

De Luca A(1), Altshuler BL(2), Kravtsov VE(3), Scardicchio A(4).

Author information:
(1)Laboratoire de Physique Théorique de l'ENS and Institut de Physique Théorique Philippe Meyer, 24 Rue Lhomond, 75005 Paris, France.
(2)Physics Department, Columbia University, 538 West 120th Street, New York, New York 10027, USA.
(3)Abdus Salam International Center for Theoretical Physics, Strada Costiera 11, 34151 Trieste, Italy and L. D. Landau Institute for Theoretical Physics, 2 Kosygina Street, 119334 Moscow, Russia.
(4)Physics Department, Columbia University, 538 West 120th Street, New York, New York 10027, USA and Abdus Salam International Center for Theoretical Physics, Strada Costiera 11, 34151 Trieste, Italy and Physics Department, Princeton University, Princeton, New Jersey 08544, USA and INFN, Sezione di Trieste, Strada Costiera 11, 34151 Trieste, Italy.

Statistical analysis of the eigenfunctions of the Anderson tight-binding model with on-site disorder on regular random graphs strongly suggests that the extended states are multiFRACTAL at any finite disorder. The spectrum of FRACTAL dimensions f(α) defined in Eq. (3) remains positive for α noticeably far from 1 even when the disorder is several times weaker than the one which leads to the Anderson localization; i.e., the ergodicity can be reached only in the absence of disorder. The one-particle multiFRACTALity on the Bethe lattice signals a possible inapplicability of the equipartition law to a generic many-body quantum system as long as it remains isolated.

PMID: 25105646 [PubMed - in process]

3. J Exp Psychol Hum Percept Perform. 2014 Jul 7. [Epub ahead of print]

Haptic Perceptual Intent in Quiet Standing Affects MultiFRACTAL Scaling of Postural Fluctuations.

Palatinus Z, Kelty-Stephen DG, Kinsella-Shaw J, Carello C, Turvey MT.

Research on dynamic touch has shown that when a rod strapped to the shoulders is wielded via axial rotations, flexions-extensions, and lateral bending of the trunk, participants can selectively perceive whole rod length and partial rod length (e.g., a leftward segment) with precision comparable to wielding by hand (Palatinus, Carello & Turvey, 2011). The present research addressed whether this haptic ability is preserved in quiet standing, when postural control is limited to center of pressure (COP) fluctuations at the mm/ms scale, and, if so, whether the intentions ("perceive partial," "perceive whole") are distinguishable within the fluctuations. Given standard manipulations of rod length and attached mass, participants provided significantly distinct, appropriately scaled, whole and partial estimates of rod length. COP displacement time series were subjected to multiFRACTAL, detrended fluctuation analysis. The resultant spectrum of FRACTAL scaling exponents for gradually different-sized fluctuations revealed that "perceive partial" was manifest as larger exponents for progressively smaller fluctuations than "perceive whole." Our results indicate (a) that the significant mechanical variables for haptically perceiving object extent are available in the small scale of normal body sway, and (b) that these seemingly "passive" movements reflect the intention of the perceiver. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

PMID: 24999615 [PubMed - as supplied by publisher]

4. Opt Lett. 2014 Jul 1;39(13):3718-21. doi: 10.1364/OL.39.003718.

Experimental confirmation of long-memory correlations in star-wander data.

Zunino L, Gulich D, Funes G, Ziad A.

In this Letter we have analyzed the temporal correlations of the angle-of-arrival fluctuations of stellar images. Experimentally measured data were carefully examined by implementing multiFRACTAL detrended fluctuation analysis. This algorithm is able to discriminate the presence of FRACTAL and multiFRACTAL structures in recorded time sequences. We have confirmed that turbulence-degraded stellar wavefronts are compatible with a long-memory correlated monoFRACTAL process. This experimental result is quite significant for the accurate comprehension and modeling of the atmospheric turbulence effects on the stellar images. It can also be of great utility within the adaptive optics field.

PMID: 24978719 [PubMed - in process]

5. Phys Rev E Stat Nonlin Soft Matter Phys. 2014 Mar;89(3):032916. Epub 2014 Mar 24.

Multiscale multiFRACTAL analysis of traffic signals to uncover richer structures.

Wang J(1), Shang P(2), Cui X(3).

Author information:
(1)Department of Mathematics, School of Science, Beijing Jiaotong University, Beijing 100044, People's Republic of China and Division of Interdisciplinary Medicine and Biotechnology, Department of Medicine, Beth Israel Deaconess Medical Center/Harvard Medical School, Boston, Massachusetts 02215, USA.
(2)Department of Mathematics, School of Science, Beijing Jiaotong University, Beijing 100044, People's Republic of China.
(3)Division of Interdisciplinary Medicine and Biotechnology, Department of Medicine, Beth Israel Deaconess Medical Center/Harvard Medical School, Boston, Massachusetts 02215, USA.

MultiFRACTAL detrended fluctuation analysis (MF-DFA) is the most popular method to detect multiFRACTAL characteristics of considerable signals such as traffic signals. When FRACTAL properties vary from point to point along the series, it leads to multiFRACTALity. In this study, we concentrate not only on the fact that traffic signals have multiFRACTAL properties, but also that such properties depend on the time scale in which the multiFRACTALity is computed. Via the multiscale multiFRACTAL analysis (MMA), traffic signals appear to be far more complex and contain more information which MF-DFA cannot explore by using a fixed time scale. More importantly, we do not have to avoid data sets with crossovers or narrow the investigated time scales, which may lead to biased results. Instead, the Hurst surface provides a spectrum of local scaling exponents at different scale ranges, which helps us to easily position these crossovers. Through comparing Hurst surfaces for signals before and after removing periodical trends, we find periodicities of traffic signals are the main source of the crossovers. Besides, the Hurst surface of the weekday series behaves differently from that of the weekend series. Results also show that multiFRACTALity of traffic signals is mainly due to both broad probability density function and correlations. The effects of data loss are also discussed, which suggests that we should carefully handle MMA results when the percentage of data loss is larger than 40%.

PMID: 24730922 [PubMed - in process]

6. Phys Rev E Stat Nonlin Soft Matter Phys. 2014 Mar;89(3):032814. Epub 2014 Mar 31.

Topological properties and FRACTAL analysis of a recurrence network constructed

from fractional Brownian motions.

Liu JL(1), Yu ZG(2), Anh V(3).

Author information:

(1)Hunan Key Laboratory for Computation and Simulation in Science and Engineering

and Key Laboratory of Intelligent Computing and Information Processing of

Ministry of Education, Xiangtan University, Xiangtan, Hunan 411105, China.

(2)Hunan Key Laboratory for Computation and Simulation in Science and Engineering

and Key Laboratory of Intelligent Computing and Information Processing of

Ministry of Education, Xiangtan University, Xiangtan, Hunan 411105, China and

School of Mathematical Sciences, Queensland University of Technology, GPO Box

2434, Brisbane, Q4001, Australia.

(3)School of Mathematical Sciences, Queensland University of Technology, GPO Box

2434, Brisbane, Q4001, Australia.

Many studies have shown that we can gain additional information on time series by

investigating their accompanying complex networks. In this work, we investigate

the fundamental topological and FRACTAL properties of recurrence networks

constructed from fractional Brownian motions (FBMs). First, our results indicate

that the constructed recurrence networks have exponential degree distributions;

the average degree exponent 〈λ〉 increases first and then decreases with the

increase of Hurst index H of the associated FBMs; the relationship between H and

〈λ〉 can be represented by a cubic polynomial function. We next focus on the motif

rank distribution of recurrence networks, so that we can better understand

networks at the local structure level. We find an interesting superfamily

phenomenon, i.e., the recurrence networks with the same motif rank pattern are

grouped into two superfamilies. Last, we numerically analyze the FRACTAL and

multiFRACTAL properties of recurrence networks. We find that the average FRACTAL

dimension 〈dB〉 of recurrence networks decreases with the Hurst index H of the

associated FBMs, and their dependence approximately satisfies the linear formula

〈dB〉≈2-H, which means that the FRACTAL dimension of the associated recurrence

network is close to that of the graph of the FBM. Moreover, our numerical results

of multiFRACTAL analysis show that the multiFRACTALity exists in these recurrence

networks, and the multiFRACTALity of these networks becomes stronger at first and

then weaker as the Hurst index of the associated time series increases

from 0.4 to 0.95. In particular, the recurrence network with the Hurst index

H=0.5 possesses the strongest multiFRACTALity. In addition, the dependence

relationships of the average information dimension 〈D(1)〉 and the average

correlation dimension 〈D(2)〉 on the Hurst index H can also be fitted well with

linear functions. Our results strongly suggest that the recurrence network

inherits the basic characteristics and the FRACTAL nature of the associated FBM.


PMID: 24730906 [PubMed - in process]
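A recurrence network of the kind analyzed above can be sketched in a few lines: nodes are time points and edges connect states closer than a threshold. The sketch below uses the H = 0.5 case, where FBM reduces to ordinary Brownian motion (a cumulative sum of Gaussian steps); the threshold choice is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of a recurrence network: node i is linked to node j
    when the states are closer than eps (self-loops excluded)."""
    dist = np.abs(x[:, None] - x[None, :])           # pairwise distances
    A = (dist < eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

rng = np.random.default_rng(1)
fbm = np.cumsum(rng.standard_normal(500))            # H = 0.5: Brownian motion
A = recurrence_network(fbm, eps=0.5 * fbm.std())     # illustrative threshold
deg = A.sum(axis=1)
print(deg.min(), deg.mean(), deg.max())
```

Properties such as the degree distribution, motif counts, and the box-counting dimension reported in the abstract are then computed on this adjacency matrix.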

7. Biomed Res Int. 2014;2014:320767. doi: 10.1155/2014/320767. Epub 2014 Feb 25.

Streamlining cutaneous melanomas in young women of the Belgian Mosan region.

Hermanns-Lê T(1), Piérard S(2).

Author information:

(1)Department of Dermatopathology, Unilab Lg, University Hospital of Liège, 4000

Liège, Belgium ; Dermatology Unit, Diagnostic Centre, 4800 Verviers, Belgium.

(2)INTELSIG Laboratory, Montefiore Institute, University of Liège, 4000 Liège, Belgium.


Sporadic cutaneous melanoma (SCM) has shown a dramatic increase in incidence in

Caucasian populations over the past few decades. A particular epidemiological

increase was reported in women during their childbearing age. In the Belgian

Mosan region, a progressive unremitting increase in SCM incidence was noticed in

young women for the past 35 years. The vast majority of these SCMs were of the

superficial type without any obvious relationship with a large number of

melanocytic nevi or with signs of frequent and intense sunlight exposures as

disclosed by the extent of the mosaic subclinical melanoderma. A series of

investigations pointed to a possible relationship linking the development of some

SCM to women's hormonal status, including the effect of hormonal disruptors.

These aspects remain, however, unsettled and controversial. It is possible to

differentiate and clearly quantify the SCM shape, size, scalloped border, and

variegated pigmentation using computerized morphometry as well as FRACTAL and

multiFRACTAL methods.

PMCID: PMC3955611

PMID: 24716193 [PubMed - in process]

8. Clin Physiol Funct Imaging. 2014 Jan 13. doi: 10.1111/cpf.12126. [Epub ahead of print]


Twenty-four hour variation in heart rate variability indices derived from

fractional differintegration.

Lewis MJ(1), McNarry MA.

Author information:

(1)College of Engineering, Swansea University, Swansea, UK.

Assuming that RR time-series behave as a fractionally differintegrated Gaussian

process, García-González et al. (2003) recently proposed new indices for

quantifying variability and structure in RR data. One of these was the

'fractional noise quantifier' (fnQ), measuring the departure of an RR time-series

from a monoFRACTAL structure (i.e. a measure of its multiFRACTALity). Sixty-nine

participants (age = 34·5 ± 12·4 years, body mass index

(BMI) = 23·9 ± 2·9 kg m(-2) , maximal oxygen uptake rate (V˙O2peak

) = 42·4 ± 10·9 ml min(-1)  kg(-1) , 39 males) provided continuous beat-to-beat

ECG recordings for a 24-h period. Fractional differintegration was used to

quantify fnQ, and heart rate variability was calculated in the time domain. All

variables were evaluated during consecutive 1-h periods and also during four 6-h

blocks corresponding to morning, afternoon, evening and night periods. Apart from

RR, circadian trends in all variables were independent of gender (P = 0·11-0·59).

Apart from fnQ, all variables exhibited circadian variation (0·0005<P<0·012).

Although fnQ was statistically uniform during the 24-h period, it showed a trend

towards elevated values during evening and night. The main finding of this study

was that fnQ was elevated by around 10% during the evening and night, although

this was not statistically significant. This suggests that the structure of RR

time-series in healthy individuals is most strongly 'multiFRACTAL' during evening

and night periods. fnQ appears to be a plausible surrogate measure of

multiFRACTALity in RR time-series.

© 2014 Scandinavian Society of Clinical Physiology and Nuclear Medicine.

Published by John Wiley & Sons Ltd.

PMID: 24666809 [PubMed - as supplied by publisher]

9. ScientificWorldJournal. 2014 Jan 22;2014:894546. doi: 10.1155/2014/894546.

eCollection 2014.

MultiFRACTAL framework based on blanket method.

Paskaš MP(1), Reljin IS(2), Reljin BD(3).

Author information:

(1)School of Electrical Engineering, University of Belgrade, Bulevar kralja

Aleksandra 73, 11121 Belgrade, Serbia ; Innovation Center of School of Electrical

Engineering, University of Belgrade, Bulevar kralja Aleksandra 73, 11121

Belgrade, Serbia.

(2)School of Electrical Engineering, University of Belgrade, Bulevar kralja

Aleksandra 73, 11121 Belgrade, Serbia.

(3)Innovation Center of School of Electrical Engineering, University of Belgrade,

Bulevar kralja Aleksandra 73, 11121 Belgrade, Serbia.

This paper proposes two local multiFRACTAL measures motivated by blanket method

for calculation of FRACTAL dimension. They cover both FRACTAL approaches familiar

in image processing. The first two measures (proposed Methods 1 and 3) support a

model of the image with embedding dimension three, while the other supports a

model of the image embedded in a space of dimension three (proposed Method 2). While the

classical blanket method provides only one value for an image (the FRACTAL dimension),

the multiFRACTAL spectrum obtained by any of the proposed measures gives a whole

range of dimensional values. This means that proposed multiFRACTAL blanket model

generalizes classical (monoFRACTAL) blanket method and other versions of this

monoFRACTAL approach implemented locally. Proposed measures are validated on

Brodatz image database through texture classification. All proposed methods give

similar classification results, while average computation time of Method 3 is

substantially longer.

PMCID: PMC3919058

PMID: 24578664 [PubMed - in process]
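The classical (monoFRACTAL) blanket method that this paper generalizes can be sketched as follows: upper and lower "blankets" are grown around the gray-level surface, and the FRACTAL dimension is read off the scaling of the blanket area A(k) ~ k^(2-D). This is a minimal illustration with 4-neighbour dilation and erosion, not the authors' multiFRACTAL extension.

```python
import numpy as np

def blanket_dimension(img, n_iter=8):
    """Classical blanket estimate of a gray-level image's FRACTAL dimension:
    grow upper/lower blankets by +/-1 and 4-neighbour dilation/erosion,
    then fit log A(k) ~ (2 - D) log k, where A(k) = V(k) / (2k)."""
    u = img.astype(float).copy()                     # upper blanket
    b = img.astype(float).copy()                     # lower blanket
    areas = []
    for k in range(1, n_iter + 1):
        pu = np.pad(u, 1, mode='edge')
        pb = np.pad(b, 1, mode='edge')
        nb_max = np.maximum.reduce([pu[:-2, 1:-1], pu[2:, 1:-1],
                                    pu[1:-1, :-2], pu[1:-1, 2:]])
        nb_min = np.minimum.reduce([pb[:-2, 1:-1], pb[2:, 1:-1],
                                    pb[1:-1, :-2], pb[1:-1, 2:]])
        u = np.maximum(u + 1, nb_max)
        b = np.minimum(b - 1, nb_min)
        areas.append((u - b).sum() / (2 * k))        # blanket surface area
    k = np.arange(1, n_iter + 1)
    return 2 - np.polyfit(np.log(k), np.log(areas), 1)[0]

plane = np.add.outer(np.arange(64.0), np.arange(64.0))
rng = np.random.default_rng(2)
rough = rng.integers(0, 256, (64, 64))
D_plane = blanket_dimension(plane)                   # smooth surface: D = 2
D_rough = blanket_dimension(rough)                   # rough surface: D > 2
print(D_plane, D_rough)
```

The multiFRACTAL variants in the abstract replace this single global fit with locally computed measures, yielding a spectrum of dimensional values instead of one number.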

10. Water Res. 2014 Apr 15;53:322-8. doi: 10.1016/j.watres.2014.01.008. Epub 2014 Jan


Settling velocities of multiFRACTAL flocs formed in chemical coagulation process.

Vahedi A(1), Gorczyca B(2).

Author information:

(1)Red River College, Department of Civil Engineering Technology, Winnipeg,

Manitoba, Canada. Electronic address:

(2)Department of Civil Engineering, University of Manitoba, Canada. Electronic address:


A number of different flocculation mechanisms are involved in the formation of

chemical coagulation flocs. Consequently, two flocs with the same size may have

been formed by different mechanisms of aggregation and therefore have different

arrangement of primary particles. As a result, two flocs with the same size may

have different masses or mass distributions and therefore, different settling

velocities. Although the correct estimation of the floc mass and density is

critical for the development of the floc settling model, none of the suggested

floc settling models incorporate the information on mass distribution and

variable density of flocs. A probability-based method is used to determine the

floc FRACTAL dimensions on floc images. The results demonstrated that flocs

formed in lime softening coagulation are multiFRACTAL. The multiFRACTAL spectra

indicated the existence of multiple FRACTAL dimensions, as opposed to the unique

box-counting dimension, a morphology-based FRACTAL dimension typically

introduced into Stokes' law. These FRACTAL dimensions may provide information

on the flocs' aggregation mechanism, floc's structure, and the distribution of

mass inside the floc. More research is required to investigate how to utilize the

information obtained from the multiFRACTAL spectra to incorporate the variable

floc density and nonhomogeneous mass distribution of flocs into the floc settling model.


Copyright © 2014 Elsevier Ltd. All rights reserved.

PMID: 24530551 [PubMed - in process]
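The role of the FRACTAL dimension in a floc settling model can be illustrated with the standard modification of Stokes' law: for a FRACTAL aggregate the excess density scales as (d/d0)^(D-3), so the settling velocity scales as d^(D-1) rather than the solid-sphere d^2. The primary-particle size and densities below are illustrative assumptions, not values from the paper.

```python
def stokes_fractal_velocity(d, D, d0=1e-6, rho_p=2500.0, rho_w=1000.0,
                            mu=1e-3, g=9.81):
    """Stokes settling velocity (m/s) of a FRACTAL floc of size d (m) built
    from primary particles of size d0: excess density scales as (d/d0)**(D-3),
    so v scales as d**(D-1) instead of the solid-sphere d**2."""
    excess = (rho_p - rho_w) * (d / d0) ** (D - 3.0)  # fractal excess density
    return g * excess * d ** 2 / (18.0 * mu)

d = 100e-6                                            # 100 micron floc
vs = [stokes_fractal_velocity(d, D) for D in (3.0, 2.3, 1.8)]
print(vs)                                             # settling slows as D drops
```

This makes the abstract's point concrete: two flocs of identical size but different FRACTAL dimension (different internal mass distribution) settle at very different velocities.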

11. Soc Neurosci. 2014;9(3):219-34. doi: 10.1080/17470919.2014.882861. Epub 2014 Feb


Neural signatures of team coordination are revealed by multiFRACTAL analysis.

Likens AD(1), Amazeen PG, Stevens R, Galloway T, Gorman JC.

Author information:

(1)a Department of Psychology , Arizona State University , Tempe , USA.

The quality of a team depends on its ability to deliver information through a

hierarchy of team members and negotiate processes spanning different time scales.

That structure and the behavior that results from it pose problems for

researchers because multiply-nested interactions are not easily separated. We

explored the behavior of a six-person team engaged in a Submarine Piloting and

Navigation (SPAN) task using the tools of dynamical systems. The data were a

single entropy time series that showed the distribution of activity across six

team members, as recorded by nine-channel electroencephalography (EEG). A single

team's data were analyzed for the purposes of illustrating the utility of

multiFRACTAL analysis and allowing for in-depth exploratory analysis of temporal

characteristics. Could the meaningful events experienced by one of these teams be

captured using multiFRACTAL analysis, a dynamical systems tool that is

specifically designed to extract patterns across levels of analysis? Results

indicate that nested patterns of team activity can be identified from neural data

streams, including both routine and novel events. The novelty of this tool is the

ability to identify social patterns from the brain activity of individuals in the

social interaction. Implications for application and future directions of this

research are discussed.

PMID: 24517441 [PubMed - indexed for MEDLINE]

12. Comput Math Methods Med. 2013;2013:152828. doi: 10.1155/2013/152828. Epub 2013

Dec 17.

Coarse-grained multiFRACTALity analysis based on structure function measurements

to discriminate healthy from distressed foetuses.

Oudjemia S(1), Zaylaa A(2), Haddab S(1), Girault JM(2).

Author information:

(1)University of Mouloud Mammeri, Tizi-Ouzou, Algeria.

(2)Signal & Imaging Group, University François Rabelais of Tours, UMR INSERM U930,

PRES Loire Valley University, 7 Avenue Marcel Dassault, 37200 Tours Cedex, France.


This paper proposes a combined coarse-grained multiFRACTAL method to discriminate

between distressed and normal foetuses. The coarse-graining operation was

performed by means of a coarse-grained procedure and the multiFRACTAL operation

was based on a structure function. The proposed method was evaluated by one

hundred recordings including eighty normal foetuses and twenty distressed

foetuses. We found that it was possible to discriminate between distressed and

normal foetuses using the Hurst exponent, singularity, and Holder spectra.

PMCID: PMC3877591

PMID: 24454527 [PubMed - indexed for MEDLINE]

13. Microvasc Res. 2014 Mar;92:62-71. doi: 10.1016/j.mvr.2014.01.005. Epub 2014 Jan


Functional slit lamp biomicroscopy for imaging bulbar conjunctival

microvasculature in contact lens wearers.

Jiang H(1), Zhong J(2), DeBuc DC(3), Tao A(4), Xu Z(4), Lam BL(3), Liu C(5), Wang


Author information:

(1)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA; Department of

Neurology, University of Miami, Miami, FL, USA. Electronic address:

(2)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA; Department of

Ophthalmology, Hangzhou First People's Hospital, Hangzhou, Zhejiang, China.

(3)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA.

(4)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA; School of

Ophthalmology and Optometry, Wenzhou Medical College, Wenzhou, Zhejiang, China.

(5)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA; Department of

Biomedical Engineering, University of Miami, Miami, FL, USA.

PURPOSE: To develop, test and validate functional slit lamp biomicroscopy (FSLB)

for generating non-invasive bulbar conjunctival microvascular perfusion maps

(nMPMs) and assessing morphometry and hemodynamics.

METHODS: FSLB was adapted from a traditional slit-lamp microscope by attaching a

digital camera to image the bulbar conjunctiva to create nMPMs and measure

venular blood flow hemodynamics. High definition images with a large field of

view were obtained on the temporal bulbar conjunctiva for creating nMPMs. A high

imaging rate of 60 frames per second and a high magnification of ~210× were

achieved using the camera's inherent high-speed setting and Movie Crop Function,

for imaging hemodynamics. Custom software was developed to segment bulbar

conjunctival nMPMs for further FRACTAL analysis and quantitatively measure blood

vessel diameter, blood flow velocity and flow rate. Six human subjects were

imaged before and after 6h of wearing contact lenses. MonoFRACTAL and

multiFRACTAL analyses were performed to quantify FRACTALity of the nMPMs.

RESULTS: The mean bulbar conjunctival vessel diameter was 18.8 ± 2.7 μm at

baseline and increased to 19.6 ± 2.4 μm after 6h of lens wear (P=0.020). The

blood flow velocity was increased from 0.60 ± 0.12 mm/s to 0.88 ± 0.21 mm/s

(P=0.001). The blood flow rate was also increased from 129.8 ± 59.9 pl/s to 207.2

± 81.3 pl/s (P=0.001). Bulbar conjunctival nMPMs showed the intricate details of

the bulbar conjunctival microvascular network. At baseline, FRACTAL dimension was

1.63 ± 0.05 and 1.71 ± 0.03 analyzed by monoFRACTAL and multiFRACTAL analyses,

respectively. Significant increases in FRACTAL dimensions were found after 6h of

lens wear (P<0.05).

CONCLUSIONS: Microvascular network's FRACTALity, morphometry and hemodynamics of

the human bulbar conjunctiva can be measured easily and reliably using FSLB. The

alterations of the FRACTAL dimensions, morphometry and hemodynamics during

contact lens wear may indicate ocular microvascular responses to contact lens wear.


Copyright © 2014 Elsevier Inc. All rights reserved.

PMCID: PMC3960300 [Available on 2015/3/1]

PMID: 24444784 [PubMed - in process]

14. Phys Rev E Stat Nonlin Soft Matter Phys. 2013 Oct;88(4):042116. Epub 2013 Oct 9.

Localization-delocalization transition of the instantaneous normal modes of

liquid water.

Huang BC(1), Chang CH.

Author information:

(1)Institute of Physics, National Chiao Tung University, Hsinchu 300, Taiwan.

Despite the fact that the localization-delocalization transition (LDT) widely

exists in wave systems, quantitative studies on its critical and multiFRACTAL

properties are mainly focused on solids. In this work, these properties are

investigated on the vibrational motions of liquid water. Simulations of up to

18000 molecules on the flexible extended simple point charge water model provide

nearly 10(6) instantaneous normal modes. They are shown to undergo an LDT close

to the translational transition and exhibit multiFRACTAL fluctuations while

approaching the LDT. In combination with finite-size scaling, multiFRACTAL

analysis predicts the critical frequency Im(ω(c))≈-131.6 cm(-1) for unstable

modes at room temperature. The estimated critical exponent ν≈1.60 is close to

those of other calculated systems in the same Wigner-Dyson class. At the LDT, the

FRACTAL spectrum f(α) and the most probable local vibrational intensity

α(mc)≈4.04 coincide with those of the Anderson model, which might be additional

universal properties of LDT in more general wave systems. The results extend the

validity of the multiFRACTAL scaling approach beyond Andersonian systems to a

Hessian system.

PMID: 24229125 [PubMed]

15. J Med Eng Technol. 2014 Jan;38(1):55-61. doi: 10.3109/03091902.2013.849298. Epub

2013 Nov 14.

MultiFRACTAL application on electrocardiogram.

Mercy Cleetus HM(1), Singh D.

Author information:

(1)Department of Instrumentation and Control, Dr B. R Ambedkar National Institute of

Technology , Jalandhar (Punjab) , India 144011.

MultiFRACTAL detrended fluctuation analysis is applied to analyse the degree of

disorders, complexity and irregularity based on the scaling behaviour of the

electrocardiogram. Since this method is based on random walk theory, noise level

due to imperfect measurement in recording is reduced and it can systematically

eliminate trends of different orders. The essence of this method is to extract

the FRACTAL features in ECG which can reflect changes in adaptability of

physiological processes and to classify the pathological conditions and can lead

to successful diagnosis. FRACTAL analysis is, therefore, a promising diagnostic

tool in cardiovascular disease diagnosis and evaluation.

PMID: 24228785 [PubMed - indexed for MEDLINE]
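MultiFRACTAL detrended fluctuation analysis, as applied above to the ECG, can be sketched as follows: the integrated profile is detrended segment by segment, the q-th order fluctuation function F_q(s) is formed, and the generalized Hurst exponents h(q) are its log-log slopes. A narrow h(q) spread indicates a monoFRACTAL signal, as in the white-noise check below (illustrative scales and q values, not the authors' settings):

```python
import numpy as np

def mfdfa_hq(x, scales, qs):
    """Generalized Hurst exponents h(q) via multiFRACTAL detrended
    fluctuation analysis (order-1 detrending)."""
    y = np.cumsum(x - x.mean())                      # integrated profile
    Fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n = len(y) // s
        t = np.arange(s)
        var = np.empty(n)
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            c = np.polyfit(t, seg, 1)                # segment-wise detrending
            var[i] = np.mean((seg - np.polyval(c, t)) ** 2)
        for k, q in enumerate(qs):
            if q == 0:                               # q -> 0 limit (log average)
                Fq[k, j] = np.exp(0.5 * np.mean(np.log(var)))
            else:
                Fq[k, j] = np.mean(var ** (q / 2.0)) ** (1.0 / q)
    logs = np.log(scales)
    return np.array([np.polyfit(logs, np.log(Fq[k]), 1)[0]
                     for k in range(len(qs))])

rng = np.random.default_rng(3)
x = rng.standard_normal(8000)                        # monoFRACTAL test signal
h = mfdfa_hq(x, scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2])
print(h)                                             # expected near 0.5 for all q
```

A pathological RR series would instead show h(q) varying strongly with q, i.e. a wide multiFRACTAL spectrum.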

16. Biomed Mater Eng. 2014;24(1):163-71. doi: 10.3233/BME-130796.

Evaluation of breast cancer chemotherapy efficacy with multiFRACTAL spectrum

analysis of magnetic resonance image.

Li L(1), Hu WY, Liu LZ, Pang YC, Shao YZ.

Author information:

(1)School of Physics and Engineering, Sun Yat-sen University, Guangzhou 510275,

China ; State Key Laboratory of Oncology in Southern China, Imaging Diagnostic and

Interventional Center, Cancer Center, Sun Yat-sen University, Guangzhou 510060, China.


MultiFRACTAL spectrum analysis of dynamic contrast enhanced (DCE) breast MR

images was used to establish a new quantitative analysis method for solid tumor

blood perfusion and to explore its applicability in evaluating efficacy of breast

cancer chemotherapy. Five randomly selected patients suffering from newly

diagnosed malignant breast nodule lesions were enrolled in this study, and four

of them were treated with neoadjuvant chemotherapy. Their DCE breast MR images

were collected before and after treatment. Chemotherapeutic efficacy was analyzed

using international response evaluation criteria for solid tumors (RECIST).

Sandbox method for statistical number density was employed to measure and

calculate multiFRACTAL spectra of DCE breast MR images with spatiotemporal

characteristics. MultiFRACTAL spectral data of malignant lesions before and after

chemotherapy were compared. MultiFRACTAL spectra of malignant lesions show an

asymmetric bell shape. Chemotherapy efficacy was assessed as partial remission

(PR) for three patients, whose multiFRACTAL spectral width increased significantly

after chemotherapy, and as stable disease (SD) for the other patient, whose

spectral width changed only slightly. MultiFRACTAL spectral width correlates with

blood-supply condition of tumor lesion before and after chemotherapy, providing a

potential suitable characteristic parameter for evaluating chemotherapeutic

efficacy quantitatively.

PMID: 24211895 [PubMed - indexed for MEDLINE]

17. J Theor Biol. 2014 Feb 21;343:44-53. doi: 10.1016/j.jtbi.2013.10.011. Epub 2013

Oct 31.

MultiFRACTAL analysis of neutral community spatial structure.

Yakimov BN(1), Iudin DI(2), Solntsev LA(3), Gelashvili DB(3).

Author information:

(1)Department of Ecology, Faculty of Biology, Nizhny Novgorod State University,

Prospekt Gagarina 23, Nizhny Novgorod 603950, Russia; Institute of Applied

Physics of the Russian Academy of Sciences, Ul'yanov Street 46, Nizhny Novgorod

603950, Russia. Electronic address:

(2)Department of Ecology, Faculty of Biology, Nizhny Novgorod State University,

Prospekt Gagarina 23, Nizhny Novgorod 603950, Russia; Institute of Applied

Physics of the Russian Academy of Sciences, Ul'yanov Street 46, Nizhny Novgorod

603950, Russia.

(3)Department of Ecology, Faculty of Biology, Nizhny Novgorod State University,

Prospekt Gagarina 23, Nizhny Novgorod 603950, Russia.

The spatial structure of neutral communities has nontrivial properties, which are

described traditionally by the Species-area relationship (SAR) and the Species

Abundance Distribution (SAD). FRACTAL analysis is an alternative way to describe

community structure, the final product of which - a multiFRACTAL spectrum -

combines information both on the scaling parameters of species richness (similar

to SAR), and about species' relative abundances (similar to SAD). We conducted a

multiFRACTAL analysis of community spatial structure in a neutral lattice-based

model. In a realistic range of dispersal distances, moments of the species

abundance distribution form a family of curves of the same shape, which are

reduced to a single universal curve through a scaling collapse procedure. Trivial

scaling is observed on small and large scales, which reflects homogeneity of

species distribution at small scales and a limiting log-series distribution at

large scales. MultiFRACTAL spectra for different speciation rates and dispersal

kernels are obtained for the intermediate region of scaling. Analysis of spectra

reveals that the key model parameters determine not only the species richness and

its scaling, but also species dominance and rarity. We discovered a phenomenon

of negative dimensions in the multiFRACTAL spectrum. Negative dimensions have no

direct interpretation from a purely physical point of view, but have biological

meaning because they reflect the negative relationship between the number of

singletons and the area.

© 2013 Elsevier Ltd. All rights reserved.

PMID: 24184220 [PubMed - in process]

18. Biochim Biophys Acta. 2014 Jan;1840(1):565-76. doi: 10.1016/j.bbagen.2013.10.015.

Epub 2013 Oct 17.

Physico-chemical characterization and the in vitro genotoxicity of medical

implants metal alloy (TiAlV and CoCrMo) and polyethylene particles in human lymphocytes.


Gajski G(1), Jelčić Z, Oreščanin V, Gerić M, Kollar R, Garaj-Vrhovac V.

Author information:

(1)Institute for Medical Research and Occupational Health, Mutagenesis Unit, 10000

Zagreb, Croatia. Electronic address:

BACKGROUND: The main objective of the present study was to investigate chemical

composition and possible cyto/genotoxic potential of several medical implant

materials commonly used in total hip joint replacement.

METHODS: Medical implant metal alloy (Ti6Al4V and CoCrMo) and high density

polyethylene particles were analyzed by energy dispersive X-ray spectrometry

while toxicological characterization was done on human lymphocytes using

multi-biomarker approach.

RESULTS: Energy dispersive X-ray spectrometry showed that none of the elements

identified deviate from the chemical composition defined by appropriate ISO

standard. Toxicological characterization showed that the tested materials were

non-cyto/genotoxic as determined by the comet and cytokinesis-block micronucleus

(CBMN) assay. Particle morphology was found (using scanning electron and

optical microscopy) to be flat, sharp-edged, irregularly shaped fiber-like grains

with a mean particle size of less than 10 µm; this corresponds to the so-called

"submicron wear". The very large surface area per wear volume enables high

reactivity with surrounding media and cellular elements.

CONCLUSIONS: Although the orthopedic implant materials proved to be

non-cyto/genotoxic at the tested concentration (10 μg/ml), there is a constant

need to monitor patients with implanted artificial hips or other joints, to

minimize the risks of any unwanted health effects.

GENERAL SIGNIFICANCE: The FRACTAL and multiFRACTAL analyses, performed in order

to evaluate the degree of particle shape effect, showed that the FRACTAL and

multiFRACTAL terms are related to the "remnant" level of the particles' toxicity

especially with the cell viability (trypan blue method) and total number of

nucleoplasmic bridges and nuclear buds as CBMN assay parameters.

© 2013.

PMID: 24140394 [PubMed - indexed for MEDLINE]

19. Front Neurol. 2013 Oct 9;4:158. doi: 10.3389/fneur.2013.00158. eCollection 2013.

Long-Range Temporal Correlations, MultiFRACTALity, and the Causal Relation

between Neural Inputs and Movements.

Hu J(1), Zheng Y, Gao J.

Author information:

(1)Institute of Complexity Science and Big Data Technology, Guangxi University ,

Nanning , China ; PMB Intelligence LLC , Sunnyvale, CA , USA.

Understanding the causal relation between neural inputs and movements is very

important for the success of brain-machine interfaces (BMIs). In this study, we

analyze 104 neurons' firings using statistical, information theoretic, and

FRACTAL analyses. The latter include Fano factor analysis, multiFRACTAL adaptive

FRACTAL analysis (MF-AFA), and wavelet multiFRACTAL analysis. We find neuronal

firings are highly non-stationary, and Fano factor analysis always indicates

long-range correlations in neuronal firings, irrespective of whether those

firings are correlated with movement trajectory or not, and thus does not reveal

any actual correlations between neural inputs and movements. On the other hand,

MF-AFA and wavelet multiFRACTAL analysis clearly indicate that when neuronal

firings are not well correlated with movement trajectory, they do not have or

only have weak temporal correlations. When neuronal firings are well correlated

with movements, they are characterized by very strong temporal correlations, up

to a time scale comparable to the average time between two successive reaching

tasks. This suggests that neurons well correlated with hand trajectory

experienced a "re-setting" effect at the start of each reaching task, in the

sense that within the movement correlated neurons the spike trains' long-range

dependences persisted about the length of time the monkey used to switch between

task executions. A new task execution re-sets their activity, making them only

weakly correlated with their prior activities on longer time scales. We further

discuss the significance of the coalition of those important neurons in executing

cortical control of prostheses.

PMCID: PMC3793199

PMID: 24130549 [PubMed]
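The Fano factor analysis mentioned above is simple to sketch: spikes are counted in windows of a given size, and F = variance/mean of the counts. F stays near 1 at all window sizes for a Poisson train, while F growing with the window signals long-range correlation. The rate and window sizes below are illustrative assumptions:

```python
import numpy as np

def fano_factors(spike_times, windows, T):
    """Fano factor (count variance / count mean) of a spike train for several
    counting-window sizes over the recording interval [0, T]."""
    out = []
    for w in windows:
        edges = np.arange(0.0, T + w, w)             # counting windows of size w
        counts, _ = np.histogram(spike_times, edges)
        out.append(counts.var() / counts.mean())
    return np.array(out)

rng = np.random.default_rng(4)
spikes = np.sort(rng.uniform(0.0, 100.0, 2000))      # ~20 Hz Poisson-like train
F = fano_factors(spikes, windows=[0.1, 0.5, 1.0], T=100.0)
print(F)                                             # expected near 1 at all windows
```

As the abstract notes, this statistic flags long-range correlation in almost any non-stationary spike train, which is why the authors needed the more discriminating multiFRACTAL tools.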

20. IEEE Trans Image Process. 2013 Nov;22(11):4422-35. doi: 10.1109/TIP.2013.2273669.

3D lacunarity in multiFRACTAL analysis of breast tumor lesions in dynamic

contrast-enhanced magnetic resonance imaging.

Soares F, Janela F, Pereira M, Seabra J, Freire MM.

Dynamic contrast-enhanced magnetic resonance (DCE-MR) of the breast is especially

robust for the diagnosis of cancer in high-risk women due to its high

sensitivity. Its specificity may be, however, compromised since several benign

masses take up contrast agent as malignant lesions do. In this paper, we propose

a novel method of 3D multiFRACTAL analysis to characterize the spatial complexity

(spatial arrangement of texture) of breast tumors at multiple scales.

Self-similar properties are extracted from the estimation of the multiFRACTAL

scaling exponent for each clinical case, using lacunarity as the multiFRACTAL

measure. These properties include several descriptors of the multiFRACTAL spectra

reflecting the morphology and internal spatial structure of the enhanced lesions

relatively to normal tissue. The results suggest that the combined multiFRACTAL

characteristics can be effective to distinguish benign and malignant findings,

judged by the performance of the support vector machine classification method

evaluated by receiver operating characteristics with an area under the curve of

0.96. In addition, this paper confirms the presence of multiFRACTALity in DCE-MR

volumes of the breast, whereby multiple degrees of self-similarity prevail at

multiple scales. The proposed feature extraction and classification method has

the potential to complement the interpretation of the radiologists and supply a

computer-aided diagnosis system.

PMID: 24057004 [PubMed - indexed for MEDLINE]

21. Hum Mov Sci. 2013 Aug;32(4):633-51. doi: 10.1016/j.humov.2013.01.008. Epub 2013

Aug 22.

MultiFRACTAL formalisms of human behavior.

Ihlen EA(1), Vereijken B.

Author information:

(1)Department of Neuroscience, Norwegian University of Science and Technology,

Trondheim, Norway. Electronic address:

With the mounting realization that variability is an inevitable part of human

behavior comes the need to integrate this phenomenon in concomitant models and

theories of motor control. Among other things, this has resulted in a debate

throughout the last decades about the origin of variability in behavior, the

outcome of which has important implications for motor control theories. To date,

a monoFRACTAL formalism of variability has been used as the basis for arguing for

component- versus interaction-oriented theories of motor control. However,

monoFRACTAL formalism alone cannot decide between the opposing sides of the

debate. The present theoretical overview introduces multiFRACTAL formalisms as a

necessary extension of the conventional monoFRACTAL formalism. In multiFRACTAL

formalisms, the scale invariance of behavior is numerically defined as a spectrum

of scaling exponents, rather than a single average exponent as in the monoFRACTAL

formalism. Several methods to estimate the multiFRACTAL spectrum of scaling

exponents - all within two multiFRACTAL formalisms called large deviation and

Legendre formalism - are introduced and briefly discussed. Furthermore, the

multiFRACTAL analyses within these two formalisms are applied to several

performance tasks to illustrate how explanations of motor control vary with the

methods used. The main section of the theoretical overview discusses the

implications of multiFRACTAL extensions of the component- and

interaction-oriented models for existing theories of motor control.

Copyright © 2013 Elsevier B.V. All rights reserved.

PMID: 24054900 [PubMed - indexed for MEDLINE]
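The Legendre formalism discussed above converts the mass exponents tau(q) into the multiFRACTAL spectrum via a Legendre transform: alpha = dtau/dq and f(alpha) = q*alpha - tau(q). A minimal numerical sketch; for a monoFRACTAL signal, tau(q) = qH - 1, and the spectrum collapses to the single point (H, 1):

```python
import numpy as np

def legendre_spectrum(qs, tau):
    """MultiFRACTAL (Legendre) spectrum from the mass exponents tau(q):
    alpha = d tau / d q,  f(alpha) = q * alpha - tau(q)."""
    alpha = np.gradient(tau, qs)                     # numerical derivative
    f = qs * alpha - tau
    return alpha, f

qs = np.linspace(-5.0, 5.0, 201)
H = 0.7
tau_mono = qs * H - 1.0                              # monoFRACTAL mass exponents
alpha, f = legendre_spectrum(qs, tau_mono)
print(alpha.min(), alpha.max(), f.min(), f.max())    # alpha pinned at H, f at 1
```

A genuinely multiFRACTAL signal has a nonlinear tau(q), so alpha spreads over an interval and f(alpha) traces the familiar single-humped spectrum rather than a point.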

22. Comput Math Methods Med. 2013;2013:262931. doi: 10.1155/2013/262931. Epub 2013

Aug 19.

Scale-specific multiFRACTAL medical image analysis.

Braverman B(1), Tambasco M.

Author information:

(1)Department of Physics, MIT-Harvard Center for Ultracold Atoms and Research

Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, MA

02139, USA.

FRACTAL geometry has been applied widely in the analysis of medical images to

characterize the irregular complex tissue structures that do not lend themselves

to straightforward analysis with traditional Euclidean geometry. In this study,

we treat the nonFRACTAL behaviour of medical images over large-scale ranges by

considering their box-counting FRACTAL dimension as a scale-dependent parameter

rather than a single number. We describe this approach in the context of the more

generalized Rényi entropy, in which we can also compute the information and

correlation dimensions of images. In addition, we describe and validate a

computational improvement to box-counting FRACTAL analysis. This improvement is

based on integral images, which allows the speedup of any box-counting or similar

FRACTAL analysis algorithm, including estimation of scale-dependent dimensions.

Finally, we applied our technique to images of invasive breast cancer tissue from

157 patients to show a relationship between the FRACTAL analysis of these images

over certain scale ranges and pathologic tumour grade (a standard prognosticator

for breast cancer). Our approach is general and can be applied to any medical

imaging application in which the complexity of pathological image structures may

have clinical value.

PMCID: PMC3760300

PMID: 24023588 [PubMed - indexed for MEDLINE]
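
[Compiler's note: the integral-image speedup described above can be sketched as follows (a minimal illustration with my own naming, not the authors' code). A summed-area table yields the mass of any box in four lookups, so box counting at every scale reuses a single O(N) precomputation.]

```python
import numpy as np

def box_counts(img, sizes):
    # Integral image (summed-area table): box sums in O(1) per box
    I = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    I[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                i2 = min(i + s, img.shape[0])
                j2 = min(j + s, img.shape[1])
                mass = I[i2, j2] - I[i, j2] - I[i2, j] + I[i, j]
                if mass > 0:
                    n += 1          # count boxes containing any set pixel
        counts.append(n)
    return np.array(counts)

def box_dimension(img, sizes):
    # Box-counting dimension: slope of log N(s) against log(1/s)
    N = box_counts(img, sizes)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(N), 1)
    return slope

# Sanity check: a filled square has dimension 2
img = np.ones((243, 243), dtype=np.uint8)
sizes = [1, 3, 9, 27, 81]
D = box_dimension(img, sizes)
```

Fitting the slope over restricted sub-ranges of `sizes` gives the scale-dependent dimension treated in the abstract above.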

23. PLoS One. 2013 Aug 21;8(8):e69000. doi: 10.1371/journal.pone.0069000. eCollection 2013.


MultiFRACTAL analysis for nutritional assessment.

Park Y(1), Lee K, Ziegler TR, Martin GS, Hebbar G, Vidakovic B, Jones DP.

Author information:

(1)Division of Pulmonary, Allergy and Critical Care Medicine, Department of

Medicine, Emory University, Atlanta, Georgia, United States of America ; College

of Pharmacy, Korea University, Sejong City, Korea.

The concept of multiFRACTALity is currently used to describe self-similar and

complex scaling properties observed in numerous biological signals. FRACTALs are

geometric objects or dynamic variations which exhibit some degree of similarity

(irregularity) to the original object in a wide range of scales. This approach

determines the irregularity of a biologic signal as an indicator of adaptability, the

capability to respond to unpredictable stress, and health. In the present work,

we propose the application of multiFRACTAL analysis of wavelet-transformed proton

nuclear magnetic resonance ((1)H NMR) spectra of plasma to determine nutritional

insufficiency. For validation of this method on (1)H NMR signal of human plasma,

standard deviation from classical statistical approach and Hurst exponent (H),

left slope and partition function from multiFRACTAL analysis were extracted from

(1)H NMR spectra to test whether multiFRACTAL indices could discriminate healthy

subjects from unhealthy, intensive care unit patients. After validation, the

multiFRACTAL approach was applied to spectra of plasma from a modified crossover

study of sulfur amino acid insufficiency and tested for associations with blood

lipids. The results showed that standard deviation and H, but not left slope,

were significantly different for sulfur amino acid sufficiency and insufficiency.

Quadratic discriminant analysis of H, left slope and the partition function

showed 78% overall classification accuracy according to sulfur amino acid status.

Triglycerides and apolipoprotein C3 were significantly correlated with a

multiFRACTAL model containing H, left slope, and standard deviation, and

cholesterol and high-sensitivity C-reactive protein were significantly correlated

to H. In conclusion, multiFRACTAL analysis of (1)H NMR spectra provides a new

approach to characterize nutritional status.

PMCID: PMC3749179

PMID: 23990878 [PubMed - indexed for MEDLINE]
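
[Compiler's note: the Hurst exponent H in the study above comes from wavelet machinery; the idea can be illustrated with the simpler aggregated-variance estimator (my substitution, not the authors' method). For a series with Hurst exponent H, the variance of block means decays as s^(2H−2).]

```python
import numpy as np

def hurst_aggregated_variance(x, scales):
    # Fit log Var(block means of size s) against log s; slope = 2H - 2
    x = np.asarray(x, dtype=float)
    logv, logs = [], []
    for s in scales:
        n = len(x) // s
        means = x[:n * s].reshape(n, s).mean(axis=1)
        logv.append(np.log(means.var()))
        logs.append(np.log(s))
    slope = np.polyfit(logs, logv, 1)[0]
    return 1.0 + slope / 2.0

# White noise is uncorrelated, so the estimate should sit near H = 0.5
x = np.random.default_rng(0).standard_normal(8192)
H = hurst_aggregated_variance(x, [4, 8, 16, 32, 64])
```

H > 0.5 indicates persistent (long-range correlated) variation, H < 0.5 anti-persistent variation.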

24. Carbohydr Polym. 2013 Sep 12;97(2):269-76. doi: 10.1016/j.carbpol.2013.04.099.

Epub 2013 May 9.

Fracture behavior of a commercial starch/polycaprolactone blend reinforced with

different layered silicates.

Pérez E(1), Pérez CJ, Alvarez VA, Bernal C.

Author information:

(1)INTECIN (UBA-CONICET), Engineering Faculty, University of Buenos Aires, Av. Las

Heras 2214, C1127AAR Buenos Aires, Argentina.

In the present work, composites based on a commercial starch/PCL blend

(MaterBi-Z) reinforced with three different nanoclays: natural montmorillonite

(Cloisite Na(+) (MMT)) and two modified montmorillonites (Cloisite 30B (C30B) and

Cloisite 10A (C10A)) were prepared in an intensive mixer. The aim of this

investigation was to determine the effect of the different nanoclays on the

quasi-static fracture behavior of MaterBi-Z nanocomposites. An improvement in the

fracture behavior for the composite with low contents of C30B was obtained,

probably due to the easy debonding of clay achieved from a relatively weak

filler-matrix interaction. On the other hand, a strong interaction had a

detrimental effect on the material fracture toughness for the MaterBi-Z/C10A

composites as a result of the higher compatibility of this organo-modified clay

with the hydrophobic matrix. Intermediate values of fracture toughness,

determined using the J-integral approach (Jc), were found for the composites with

MMT due to its intermediate interaction with the matrix. The different

filler-matrix interactions observed were also confirmed from the application of

the Pukánszky and Maurer model. In addition, multiFRACTAL analysis was applied to

describe the topography of fracture surfaces. Thus, the complex fracture process

could be successfully described by both experimental and theoretical tools. The

obtained results suggest that it is possible to tailor the mechanical properties

of the studied composites taking into account their further application.

Copyright © 2013 Elsevier Ltd. All rights reserved.

PMID: 23911445 [PubMed - indexed for MEDLINE]

25. PLoS One. 2013 Jul 3;8(7):e68360. doi: 10.1371/journal.pone.0068360. Print 2013.

MultiFRACTAL detrended fluctuation analysis of human EEG: preliminary

investigation and comparison with the wavelet transform modulus maxima technique.

Zorick T(1), Mandelkern MA.

Author information:

(1)Department of Psychiatry, Greater Los Angeles Veterans Administration Healthcare

System, Los Angeles, California, United States of America.

Recently, many lines of investigation in neuroscience and statistical physics

have converged to raise the hypothesis that the underlying pattern of neuronal

activation which results in electroencephalography (EEG) signals is nonlinear,

with self-affine dynamics, while scalp-recorded EEG signals themselves are

nonstationary. Therefore, traditional methods of EEG analysis may miss many

properties inherent in such signals. Similarly, FRACTAL analysis of EEG signals

has shown scaling behaviors that may not be consistent with pure monoFRACTAL

processes. In this study, we hypothesized that scalp-recorded human EEG signals

may be better modeled as an underlying multiFRACTAL process. We utilized the

Physionet online database, a publicly available database of human EEG signals as

a standardized reference database for this study. Herein, we report the use of

multiFRACTAL detrended fluctuation analysis on human EEG signals derived from

waking and different sleep stages, and show evidence that supports the use of

multiFRACTAL methods. Next, we compare multiFRACTAL detrended fluctuation

analysis to a previously published multiFRACTAL technique, wavelet transform

modulus maxima, using EEG signals from waking and sleep, and demonstrate that

multiFRACTAL detrended fluctuation analysis has lower indices of variability.

Finally, we report a preliminary investigation into the use of multiFRACTAL

detrended fluctuation analysis as a pattern classification technique on human EEG

signals from waking and different sleep stages, and demonstrate its potential

utility for automatic classification of different states of consciousness.

Therefore, multiFRACTAL detrended fluctuation analysis may be a useful pattern

classification technique to distinguish among different states of brain function.

PMCID: PMC3700954

PMID: 23844189 [PubMed - indexed for MEDLINE]
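
[Compiler's note: a bare-bones version of multiFRACTAL detrended fluctuation analysis can be sketched as follows. This is a simplified single-direction variant with my own naming; published implementations also average over segments taken from the reversed series.]

```python
import numpy as np

def mfdfa_hq(x, scales, qs, order=1):
    # Generalized Hurst exponents h(q): slope of log Fq(s) vs log s
    y = np.cumsum(x - np.mean(x))                 # profile of the series
    hq = []
    for q in qs:
        logF = []
        for s in scales:
            n_seg = len(y) // s
            F2 = []
            for v in range(n_seg):
                seg = y[v * s:(v + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, order), t)
                F2.append(np.mean((seg - trend) ** 2))  # detrended variance
            F2 = np.array(F2)
            if q == 0:
                logF.append(0.5 * np.mean(np.log(F2)))  # logarithmic average
            else:
                logF.append(np.log(np.mean(F2 ** (q / 2))) / q)
        hq.append(np.polyfit(np.log(scales), logF, 1)[0])
    return np.array(hq)

# For white noise, h(q) should stay near 0.5 for all q (monofractal)
rng = np.random.default_rng(1)
x = rng.standard_normal(4096)
hq = mfdfa_hq(x, [16, 32, 64, 128, 256], [-2.0, 0.0, 2.0])
```

A genuinely multiFRACTAL signal shows h(q) decreasing with q; a flat h(q) indicates a monoFRACTAL process.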

26. IEEE Trans Biomed Eng. 2013 Nov;60(11):3204-15. doi: 10.1109/TBME.2013.2271383.

Epub 2013 Jun 27.

MultiFRACTAL texture estimation for detection and segmentation of brain tumors.

Islam A, Reza SM, Iftekharuddin KM.

A stochastic model for characterizing tumor texture in brain magnetic resonance

(MR) images is proposed. The efficacy of the model is demonstrated in

patient-independent brain tumor texture feature extraction and tumor segmentation

in magnetic resonance images (MRIs). Due to complex appearance in MRI, brain

tumor texture is formulated using a multiresolution-FRACTAL model known as

multifractional Brownian motion (mBm). A detailed mathematical derivation of the

mBm model and a corresponding novel algorithm to extract spatially varying

multiFRACTAL features are proposed. A multiFRACTAL feature-based brain tumor

segmentation method is developed next. To evaluate efficacy, tumor segmentation

performance using the proposed multiFRACTAL feature is compared with that using

a Gabor-like multiscale texton feature. Furthermore, a novel patient-independent

tumor segmentation scheme is proposed by extending the well-known AdaBoost

algorithm. The modification of the AdaBoost algorithm assigns weights to component

classifiers based on their ability to classify difficult samples and confidence

in such classification. Experimental results for 14 patients with over 300 MRIs

show the efficacy of the proposed technique in automatic segmentation of tumors

in brain MRIs. Finally, comparison with other state-of-the-art brain tumor

segmentation methods on the publicly available low-grade glioma BRATS2012 dataset

shows that our segmentation results are more consistent and on average

outperform these methods for the patients where ground truth is made available.

PMID: 23807424 [PubMed - indexed for MEDLINE]

27. Microvasc Res. 2013 Sep;89:172-5. doi: 10.1016/j.mvr.2013.06.008. Epub 2013 Jun


Automated segmentation and FRACTAL analysis of high-resolution non-invasive

capillary perfusion maps of the human retina.

Jiang H(1), Debuc DC, Rundek T, Lam BL, Wright CB, Shen M, Tao A, Wang J.

Author information:

(1)Bascom Palmer Eye Institute, University of Miami, Miami, FL, USA; Department of

Neurology, University of Miami, Miami, 33136, USA.

The retina provides a window to study the pathophysiology of cerebrovascular

diseases. Pathological retinal microvascular changes may reflect microangiopathic

processes in the brain. Recent advances in optical imaging techniques have

enabled the imaging of the retinal microvasculature at the capillary level, and

the generation of high-resolution, non-invasive capillary perfusion maps (nCPMs)

with the Retinal Function Imager (RFI). However, the lack of quantitative

analyses of the nCPMs may limit the wider application of the method in clinical

research. The goal of this project was to demonstrate the feasibility of

automated segmentation and FRACTAL analysis of nCPMs. We took two nCPMs of each

subject in a group of 6 healthy volunteers and used our segmentation algorithm to

do the automated segmentation for monoFRACTAL and multiFRACTAL analyses. The

monoFRACTAL dimension was 1.885±0.020, and the multiFRACTAL dimension was

1.876±0.010 (P=0.108). The coefficient of repeatability was 0.070 for monoFRACTAL

analysis and 0.026 for multiFRACTAL analysis. This study demonstrated that the

automatic segmentation of nCPMs is feasible for FRACTAL analyses. Both

monoFRACTAL and multiFRACTAL analyses yielded similar results. The quantitative

analyses of microvasculature at the capillary level may open up a new era for

studying microvascular diseases such as cerebral small vessel disease.

Copyright © 2013 Elsevier Inc. All rights reserved.

PMCID: PMC3773708

PMID: 23806780 [PubMed - indexed for MEDLINE]
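
[Compiler's note: the coefficient of repeatability quoted above is a Bland–Altman-style statistic. One common form, assumed here since conventions differ (some authors use 2.77 times the within-subject SD, which is equivalent), is 1.96 times the standard deviation of the paired differences.]

```python
import numpy as np

def coefficient_of_repeatability(m1, m2):
    # 1.96 * SD of within-subject differences between two repeat measurements
    d = np.asarray(m1, dtype=float) - np.asarray(m2, dtype=float)
    return 1.96 * d.std(ddof=1)

# Two repeated monoFRACTAL-dimension readings per subject (made-up numbers)
first  = [1.88, 1.90, 1.87, 1.89]
second = [1.86, 1.92, 1.85, 1.91]
cr = coefficient_of_repeatability(first, second)
```

A smaller coefficient, as reported for the multiFRACTAL analysis above (0.026 versus 0.070), means repeat measurements of the same subject agree more closely.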

28. Comput Math Methods Med. 2013;2013:376152. doi: 10.1155/2013/376152. Epub 2013

May 20.

Classification of prolapsed mitral valve versus healthy heart from

phonocardiograms by multiFRACTAL analysis.

Gavrovska A(1), Zajić G, Reljin I, Reljin B.

Author information:

(1)Research and Development Department, Innovation Center of the School of

Electrical Engineering in Belgrade, Bulevar Kralja Aleksandra 73, 11120 Belgrade, Serbia.


Phonocardiography has shown a great potential for developing low-cost

computer-aided diagnosis systems for cardiovascular monitoring. So far, most of

the work reported regarding cardiosignal analysis using multiFRACTALs is oriented

towards heartbeat dynamics. This paper represents a step towards automatic

detection of one of the most common pathological syndromes, so-called mitral

valve prolapse (MVP), using phonocardiograms and multiFRACTAL analysis. Subtle

features characteristic for MVP in phonocardiograms may be difficult to detect.

The approach for revealing such features should be locally based rather than

globally based. Nevertheless, if their appearances are specific and frequent,

they can affect a multiFRACTAL spectrum. This has been the case in our experiment

with the click syndrome. In total, 117 pediatric phonocardiographic recordings

(PCGs), 8 seconds long each, obtained from 117 patients were used for automatic

PMV detection. We propose a two-step algorithm to distinguish PCGs that

belong to children with healthy hearts and children with prolapsed mitral valves

(PMVs). Obtained results show high accuracy of the method. We achieved 96.91%

accuracy on the dataset (97 recordings). Additionally, 90% accuracy is achieved

for the evaluation dataset (20 recordings). The content of the datasets was

confirmed by echocardiographic screening.

PMCID: PMC3671509

PMID: 23762185 [PubMed - indexed for MEDLINE]

29. Front Comput Neurosci. 2013 Jun 4;7:72. doi: 10.3389/fncom.2013.00072.

eCollection 2013.

The relevance of network micro-structure for neural dynamics.

Pernice V(1), Deger M, Cardanobile S, Rotter S.

Author information:

(1)Bernstein Center Freiburg and Faculty of Biology, Albert-Ludwig University

Freiburg, Germany.

The activity of cortical neurons is determined by the input they receive from

presynaptic neurons. Many previous studies have investigated how specific aspects

of the statistics of the input affect the spike trains of single neurons and

neurons in recurrent networks. However, typically very simple random network

models are considered in such studies. Here we use a recently developed algorithm

to construct networks based on a quasi-FRACTAL probability measure which are much

more variable than commonly used network models, and which therefore promise to

sample the space of recurrent networks in a more exhaustive fashion than

previously possible. We use the generated graphs as the underlying network

topology in simulations of networks of integrate-and-fire neurons in an

asynchronous and irregular state. Based on an extensive dataset of networks and

neuronal simulations we assess statistical relations between features of the

network structure and the spiking activity. Our results highlight the strong

influence that some details of the network structure have on the activity

dynamics of both single neurons and populations, even if some global network

parameters are kept fixed. We observe specific and consistent relations between

activity characteristics like spike-train irregularity or correlations and

network properties, for example the distributions of the numbers of in- and

outgoing connections or clustering. Exploiting these relations, we demonstrate

that it is possible to estimate structural characteristics of the network from

activity data. We also assess higher order correlations of spiking activity in

the various networks considered here, and find that their occurrence strongly

depends on the network structure. These results provide directions for further

theoretical studies on recurrent networks, as well as new ways to interpret spike

train recordings from neural circuits.

PMCID: PMC3671286

PMID: 23761758 [PubMed]

30. Bull Math Biol. 2013 Sep;75(9):1544-70. doi: 10.1007/s11538-013-9859-9. Epub 2013

Jun 13.

On the FRACTAL geometry of DNA by the binary image analysis.

Cattani C(1), Pierro G.

Author information:

(1)Department of Mathematics, University of Salerno, Via Ponte Don Melillo, 84084,

Fisciano (SA), Italy.

The multiFRACTAL analysis of binary images of DNA is studied in order to define a

methodological approach to the classification of DNA sequences. This method is

based on the computation of some multiFRACTALity parameters on a suitable binary

image of DNA, which takes into account the nucleotide distribution. The binary

image of DNA is obtained by a dot-plot (recurrence plot) of the indicator matrix.

The FRACTAL geometry of these images is characterized by FRACTAL dimension (FD),

lacunarity, and succolarity. These parameters are compared with some other

coefficients such as complexity and Shannon information entropy. It will be shown

that the complexity parameters are more or less equivalent to FD, while the

parameters of multiFRACTALity have different values in the sense that sequences

with higher FD might have lower lacunarity and/or succolarity. In particular, the

genome of Drosophila melanogaster has been considered by focusing on the

chromosome 3r, which shows the highest FRACTALity with a corresponding higher

level of complexity. We will single out some results on the nucleotide

distribution in 3r with respect to complexity and FRACTALity. In particular, we

will show that sequences with higher FD also have a higher frequency distribution

of guanine, while low FD is characterized by the higher presence of adenine.

PMID: 23760660 [PubMed - indexed for MEDLINE]
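
[Compiler's note: the indicator matrix behind the dot-plot image described above is straightforward to construct; a minimal sketch, with the function name my own:]

```python
import numpy as np

def indicator_matrix(seq):
    # M[i, j] = 1 iff the nucleotides at positions i and j are equal,
    # giving the binary dot-plot (recurrence plot) image of the sequence
    s = np.frombuffer(seq.encode("ascii"), dtype=np.uint8)
    return (s[:, None] == s[None, :]).astype(np.uint8)

M = indicator_matrix("ACGTACGTTGCA")
# The main diagonal is all ones and the matrix is symmetric; the FRACTAL
# descriptors (FD, lacunarity, succolarity) are then computed on this image
```

Repeated subsequences appear as diagonal stripes in M, which is what gives the image its FRACTAL texture.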

31. J Periodontal Res. 2014 Apr;49(2):186-96. doi: 10.1111/jre.12093. Epub 2013 May


Evaluation of scaling and root planing effect in generalized chronic

periodontitis by FRACTAL and multiFRACTAL analysis.

Pârvu AE(1), Ţălu Ş, Crăciun C, Alb SF.

Author information:

(1)Department of Pathophysiology, Faculty of Medicine, "Iuliu Haţieganu" University

of Medicine and Pharmacy, Cluj-Napoca, Romania.

BACKGROUND AND OBJECTIVE: FRACTAL and multiFRACTAL analysis are useful additional

non-invasive methods for quantitative description of complex morphological

features. However, the quantitative and qualitative assessment of morphologic

changes within human gingival cells and tissues are still unexplored. The aim of

this work is to assess the structural gingival changes in patients with

generalized chronic periodontitis (GCP), before and after scaling and root

planing (SRP) by using FRACTAL and multiFRACTAL analysis.

MATERIAL AND METHODS: Twelve adults with untreated chronic periodontitis were

treated only by SRP. At baseline and after SRP, gingivomucosal biopsies were

collected for histopathological examination. FRACTAL and multiFRACTAL analysis of

digital images of the granular, spinous and basal and conjunctive layers

structure was performed using the standard box-counting method. The FRACTAL

dimension was determined for the cell membrane, the nuclear membrane, and the

nucleolar membrane.

RESULTS: In GCP a higher FRACTAL dimension corresponds to a higher geometric

complexity of the cell contour, as its values increase when the contour

irregularities increase. The generalized FRACTAL dimensions were determined for

the conjunctive layer structure of patients with GCP and patients with GCP and

SRP. The FRACTAL and multiFRACTAL analysis of gingival biopsies confirmed earlier

findings that SRP reduces gingival injury in patients with GCP.

CONCLUSION: It has been shown that FRACTAL and multiFRACTAL analysis of tissue

images as a non-invasive technique could be used to measure contrasting

morphologic changes within human gingival cells and tissues and can provide

detailed information for investigation of healthy and diseased gingival mucosa

from patients with GCP.

© 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

PMID: 23668776 [PubMed - in process]

32. Chaos. 2013 Mar;23(1):013133. doi: 10.1063/1.4793781.

MultiFRACTAL analysis of validated wind speed time series.

García-Marín AP(1), Estévez J, Jiménez-Hornero FJ, Ayuso-Muñoz JL.

Author information:

(1)Department of Rural Engineering, University of Cordoba, P.O. Box 3048, 14080

Córdoba, Spain.

MultiFRACTAL properties of 30 min wind data series recorded at six locations in

Cadiz (Southern Spain) have been studied in this work with the aim of obtaining

detailed information for a range of time scales. Wind speed records have been

validated, applying various quality control tests as a pre-requisite before their

use, improving the reliability of the results due to the identification of

incorrect values which have been discarded in the analysis. The scaling of the

wind speed moments has been analysed and empirical moments scaling exponent

functions K(q) have been obtained. Although the same critical moment (qcrit) has

been obtained for all the places, some differences appear in other multiFRACTAL

parameters like γmax and the value of K(0). These differences have been related

to the presence of extreme events and zero data values in the data series

analysed, respectively.

PMID: 23556970 [PubMed - indexed for MEDLINE]
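
[Compiler's note: the empirical moment scaling exponent function K(q) fitted in the study above can be computed along these lines. This is my sketch; normalization choices vary across papers.]

```python
import numpy as np

def moment_scaling_K(x, scales, qs):
    # K(q) from <eps_s^q> ~ (L/s)^K(q), where eps_s is the series averaged
    # over non-overlapping windows of length s, normalized to mean 1
    x = np.asarray(x, dtype=float)
    x = x / x.mean()
    lam = len(x) / np.asarray(scales, dtype=float)   # scale ratio L/s
    Kq = []
    for q in qs:
        logM = []
        for s in scales:
            n = len(x) // s
            eps = x[:n * s].reshape(n, s).mean(axis=1)
            logM.append(np.log(np.mean(eps ** q)))
        Kq.append(np.polyfit(np.log(lam), logM, 1)[0])
    return np.array(Kq)

# For a weakly correlated positive series, K(0) = K(1) = 0 by construction
# and K(2) stays close to zero
rng = np.random.default_rng(0)
x = rng.random(4096) + 0.5
Kq = moment_scaling_K(x, [4, 8, 16, 32], [0.0, 1.0, 2.0])
```

A strictly convex K(q) with K(0) < 0 (from zero data values) is the signature discussed in the wind-speed abstract above.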

33. Chaos. 2013 Mar;23(1):013129. doi: 10.1063/1.4793355.

Cross-correlation detection and analysis for California's electricity market

based on analogous multiFRACTAL analysis.

Wang F(1), Liao GP, Li JH, Zou RB, Shi W.

Author information:

(1)Science College, Hunan Agricultural University, Changsha 410128, China.

A novel method, which we called the analogous multiFRACTAL cross-correlation

analysis, is proposed in this paper to study the multiFRACTAL behavior in the

power-law cross-correlation between price and load in California electricity

market. In addition, a statistic ρAMF-XA, which we call the analogous

multiFRACTAL cross-correlation coefficient, is defined to test whether the

cross-correlation between two given signals is genuine or not. Our analysis finds

that both the price and load time series in California electricity market express

multiFRACTAL nature. However, as indicated by the ρAMF-XA statistical test, there

is a huge difference in the cross-correlation behavior between the years 1999 and

2000 in California electricity markets.

PMID: 23556966 [PubMed - indexed for MEDLINE]

34. Curr Eye Res. 2013 Jul;38(7):781-92. doi: 10.3109/02713683.2013.779722. Epub 2013

Mar 28.

MultiFRACTAL geometry in analysis and processing of digital retinal photographs

for early diagnosis of human diabetic macular edema.

Tălu S.

Author information:

Department of AET, Discipline of Descriptive Geometry and Engineering Graphics,

Faculty of Mechanics, The Technical University of Cluj-Napoca, Cluj-Napoca, Romania.


OBJECTIVE: The purpose of this paper is to determine a quantitative assessment of

the human retinal vascular network architecture for patients with diabetic

macular edema (DME). MultiFRACTAL geometry and lacunarity parameters are used in

this study.

MATERIALS AND METHODS: A set of 10 segmented and skeletonized human retinal

images, corresponding to both normal (five images) and DME states of the retina

(five images), from the DRIVE database was analyzed using the Image J software.

Statistical analyses were performed using Microsoft Office Excel 2003 and

GraphPad InStat software.

RESULTS: The human retinal vascular network architecture has a multiFRACTAL

geometry. The average of generalized dimensions (Dq) for q = 0, 1, 2 of the

normal images (segmented versions), is similar to the DME cases (segmented

versions). The average of generalized dimensions (Dq) for q = 0, 1 of the normal

images (skeletonized versions), is slightly greater than the DME cases

(skeletonized versions). However, the average of D2 for the normal images

(skeletonized versions) is similar to the DME images. The average of lacunarity

parameter, Λ, for the normal images (segmented and skeletonized versions) is

slightly lower than the corresponding values for DME images (segmented and

skeletonized versions).

CONCLUSION: The multiFRACTAL and lacunarity analysis provides a non-invasive

predictive complementary tool for an early diagnosis of patients with DME.

PMID: 23537336 [PubMed - indexed for MEDLINE]
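
[Compiler's note: the generalized dimensions Dq for q = 0, 1, 2 used above can be estimated from box-counting measures. A compact sketch of my own, with the standard q = 1 information-dimension limit:]

```python
import numpy as np

def generalized_dimensions(img, sizes, qs):
    # Renyi dimensions D_q from box masses p_i at each box size s:
    # D_q = slope of log(sum p_i^q)/(q - 1) vs log s (q = 1 is the limit case)
    total = img.sum()
    Dq = []
    for q in qs:
        logZ, logS = [], []
        for s in sizes:
            h = (img.shape[0] // s) * s
            w = (img.shape[1] // s) * s
            blocks = img[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
            p = blocks[blocks > 0] / total
            if q == 1:
                logZ.append(np.sum(p * np.log(p)))   # information dimension
            else:
                logZ.append(np.log(np.sum(p ** q)) / (q - 1))
            logS.append(np.log(s))
        Dq.append(np.polyfit(logS, logZ, 1)[0])
    return np.array(Dq)

# Sanity check: a uniformly filled square gives D0 = D1 = D2 = 2
img = np.ones((81, 81), dtype=np.uint8)
Dq = generalized_dimensions(img, [1, 3, 9, 27], [0.0, 1.0, 2.0])
```

For a multiFRACTAL vascular pattern, Dq decreases with q (D0 ≥ D1 ≥ D2); equality holds only in the monoFRACTAL case.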

35. Behav Res Methods. 2013 Dec;45(4):928-45. doi: 10.3758/s13428-013-0317-2.

MultiFRACTAL analyses of response time series: a comparative study.

Ihlen EA.

Author information:

Department of Neuroscience, Norwegian University of Science and Technology, 7489,

Trondheim, Norway,

Response time series with a non-Gaussian distribution and long-range dependent

dynamics have been reported for several cognitive tasks. Conventional monoFRACTAL

analyses numerically define a long-range dependency as a single scaling exponent,

but they assume that the response times are Gaussian distributed. Ihlen and

Vereijken (Journal of Experimental Psychology: General, 139, 436-463, 2010)

suggested multiFRACTAL extensions of the conventional monoFRACTAL analyses that

are more suitable when the response time has a non-Gaussian distribution.

MultiFRACTAL analyses estimate a multiFRACTAL spectrum of scaling exponents that

contain the single exponent estimated by the conventional monoFRACTAL analyses.

However, a comparison of the performance of multiFRACTAL analyses with behavioral

variables has not yet been addressed. The present study compares the performance

of seven multiFRACTAL analyses. The multiFRACTAL analyses were tested on

multiplicative cascading noise that generates time series with a predefined

multiFRACTAL spectrum and with a structure of variation that mimics intermittent

response time variation. Time series with 1,024 and 4,096 samples were generated

with additive noise and multiharmonic trends of two different magnitudes

(signal-to-noise/trend ratio; 0.33 and 1). The results indicate that each

multiFRACTAL analysis has individual pros and cons related to sample size,

multiFRACTALity, and the presence of additive noise and trends in the response

time series. The summary of pros and cons of the seven multiFRACTAL analyses

provides a guideline for the choice of multiFRACTAL analyses of response time

series and other behavioral variables.

PMID: 23526256 [PubMed - indexed for MEDLINE]
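
[Compiler's note: the multiplicative cascading noise used as a test signal above can be generated in a few lines. This is a minimal binomial-cascade sketch with deterministic weights applied in random order, one of several published variants.]

```python
import numpy as np

def binomial_cascade(levels, p=0.7, rng=None):
    # Each refinement splits every cell's mass with ratio p : (1 - p),
    # assigning the larger share to the left or right child at random;
    # after `levels` steps the 2**levels cell masses form the cascade
    if rng is None:
        rng = np.random.default_rng(0)
    w = np.array([1.0])
    for _ in range(levels):
        left = np.where(rng.random(w.size) < 0.5, p, 1 - p)
        w = np.column_stack((w * left, w * (1 - left))).ravel()
    return w

w = binomial_cascade(10, p=0.7)   # 1024 samples, total mass conserved at 1
```

The resulting series is intermittent by construction, which is what lets such cascades mimic the bursty response-time variation described above.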

36. Phys Rev E Stat Nonlin Soft Matter Phys. 2013 Feb;87(2):022918. Epub 2013 Feb 28.

Performance of multiFRACTAL detrended fluctuation analysis on short time series.

López JL(1), Contreras JG.

Author information:

(1)Departamento de Física Aplicada, Centro de Investigación y de Estudios Avanzados

del Instituto Politécnico Nacional, Unidad Mérida, A.P. 73 Cordemex, 97310

Mérida, Yucatán, México.

The performance of the multiFRACTAL detrended fluctuation analysis on short time series is

evaluated for synthetic samples of several mono- and multiFRACTAL models. The

reconstruction of the generalized Hurst exponents is used to determine the range

of applicability of the method and the precision of its results as a function of

the decreasing length of the series. As an application the series of the daily

exchange rate between the U.S. dollar and the euro is studied.

PMID: 23496602 [PubMed - indexed for MEDLINE]

37. Phys Rev E Stat Nonlin Soft Matter Phys. 2013 Feb;87(2):022821. Epub 2013 Feb 28.

Wavelet-based multiFRACTAL analysis of nonlinear time series: the

earthquake-driven tsunami of 27 February 2010 in Chile.

Toledo BA(1), Chian AC, Rempel EL, Miranda RA, Muñoz PR, Valdivia JA.

Author information:

(1)Institute of Aeronautical Technology (ITA) and World Institute for Space

Environment Research (WISER), CTA/ITA/IEFM, São José dos Campos-SP 12228-900, Brazil.


We study general multiFRACTAL properties of tidal gauge and long-wave time series

which show a well defined transition between two states, as is the case of sea

level when a tsunami arrives. We adopt a method based on discrete wavelets,

called wavelet leaders, which has been successfully used in a wide range of

applications from image analysis to biomedical signals. First, we analyze an

empirical time series of tidal gauge from the tsunami event of 27 February 2010

in Chile. Then, we study a numerical solution of the driven-damped regularized

long-wave equation (RLWE) which displays on-off intermittency. Both time series

are characterized by a sudden change between two sharply distinct dynamical

states. Our analysis suggests a correspondence between the pre- and post-tsunami

states (ocean background) and the on state in the RLWE, and also between the

tsunami state (disturbed ocean) and the off state in the RLWE. A qualitative

similarity in their singularity spectra is observed, and since the RLWE is used

to model shallow water dynamics, this result could imply an underlying dynamical


PMID: 23496582 [PubMed - indexed for MEDLINE]

38. Opt Lett. 2013 Jan 15;38(2):211-3. doi: 10.1364/OL.38.000211.

Probing multiFRACTALity in tissue refractive index: prospects for precancer detection.


Das N(1), Chatterjee S, Soni J, Jagtap J, Pradhan A, Sengupta TK, Panigrahi PK,

Vitkin IA, Ghosh N.

Author information:

(1)IISER-Kolkata, BCKV Main Campus, Mohanpur, Nadia, West Bengal, India.

Multiresolution analysis on the spatial refractive index inhomogeneities in the

epithelium and connective tissue regions of a human cervix reveals a clear

signature of multiFRACTALity. Importantly, the derived multiFRACTAL parameters,

namely, the generalized Hurst exponent and the width of the singularity spectrum,

derived via multiFRACTAL detrended fluctuation analysis, shows interesting

differences between tissues having different grades of precancers. The

refractive-index fluctuations are found to be more anticorrelated, and the

strength of multiFRACTALity is observed to be considerably stronger in the higher

grades of precancers. These observations on the multiFRACTAL nature of tissue

refractive-index variations may prove to be valuable for developing

light-scattering approaches for noninvasive diagnosis of precancer and

early-stage cancer.

PMID: 23454965 [PubMed - indexed for MEDLINE]

39. Oftalmologia. 2012;56(2):63-71.

[MultiFRACTAL characterisation of human retinal blood vessels].

[Article in Romanian]

Tălu S.

Author information:

Faculty of Mechanics, Discipline of Descriptive Geometry and Engineering

Graphics, Technical University of Cluj-Napoca. STEFAN_TA@YAHOO.COM

PURPOSE: The objective of this study is to describe the microvascular network of

the normal human retina using multiFRACTAL geometry.

MATERIALS AND METHOD: The multiFRACTAL analysis of five digitized retinal images

was made with the Image J software, applying the standard box-counting method.


RESULTS: The human retinal microvascular network has a multiFRACTAL geometry. The

generalized FRACTAL dimensions Dq were expressed by the mean value and standard

deviation. A comparison with the data from studies performed in the

ophthalmologic literature was made.

CONCLUSIONS: MultiFRACTAL characterization of human retinal microvascular

network, as a non-invasive technique for the analysis of various aspects of

retinal vascular topography can be used as a potential marker for early detection

of patients with retinal diseases.

PMID: 23424766 [PubMed - indexed for MEDLINE]

40. Phys Rev E Stat Nonlin Soft Matter Phys. 2013 Jan;87(1):012921. Epub 2013 Jan 31.

Relationships of exponents in two-dimensional multiFRACTAL detrended fluctuation analysis.


Zhou Y(1), Leung Y, Yu ZG.

Author information:

(1)Department of Geography and Resource Management, The Chinese University of Hong

Kong, Hong Kong, China.

MultiFRACTAL detrended fluctuation analysis (MF-DFA) is a generalization of the

conventional multiFRACTAL analysis. It is extended from the detrended fluctuation

analysis (DFA) which is developed for the purpose of detecting long-range

correlation and FRACTAL property in stationary and nonstationary time series. The

MF-DFA and some corresponding relationships of the exponents have been

subsequently extended to the two-dimensional space. We reexamine two extended

relationships in this study and demonstrate: (i) the invalidity, for

two-dimensional fractional Brownian motion, of the relationships h(q)≡H and

h(q=2)≡H between the Hurst exponent H and the generalized Hurst exponent h(q).

Two more logical relationships are proposed instead:

h(q=2)=H for the stationary surface and h(q=2)=H+2 for the nonstationary signal.

(ii) The expression τ(q)=qh(q)-D(f), which stipulates the relationship between

the standard partition-function-based multiFRACTAL exponent τ(q) and the

generalized Hurst exponent h(q), is likewise invalid in the two-dimensional case.

Reasons for its invalidity are given from two perspectives.

PMID: 23410418 [PubMed - indexed for MEDLINE]
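[For readers who want to experiment with the MF-DFA method referenced above: a minimal one-dimensional Python sketch with first-order detrending. The scales, q value, and test series are illustrative choices only, not the authors' two-dimensional implementation.]

```python
import numpy as np

def mfdfa_hq(x, scales, q):
    """Generalized Hurst exponent h(q) via multiFRACTAL DFA (order-1 detrending)."""
    y = np.cumsum(x - np.mean(x))                         # profile of the series
    log_s, log_f = [], []
    for s in scales:
        n_seg = len(y) // s
        f2 = np.empty(n_seg)
        t = np.arange(s)
        for v in range(n_seg):
            seg = y[v * s:(v + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            f2[v] = np.mean((seg - trend) ** 2)
        if q == 0:                                        # logarithmic average for q = 0
            fq = np.exp(0.5 * np.mean(np.log(f2)))
        else:
            fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
        log_s.append(np.log(s))
        log_f.append(np.log(fq))
    return np.polyfit(log_s, log_f, 1)[0]                 # slope = h(q)

rng = np.random.default_rng(0)
noise = rng.standard_normal(8192)
h2 = mfdfa_hq(noise, scales=[16, 32, 64, 128, 256], q=2)  # near 0.5 for uncorrelated noise
```

For a monoFRACTAL signal h(q) is flat across q; a spread of h(q) values is the multiFRACTAL signature the papers in this list exploit.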

41. Med Phys. 2013 Feb;40(2):020702. doi: 10.1118/1.4774362.

MultiFRACTAL analysis of laser Doppler flowmetry signals before and after

arm-cranking exercise in an older healthy population.

Klonizakis M(1), Humeau-Heurtier A.

Author information:

(1)School of Postgraduate Medicine, University of Hertfordshire, Hatfield, UK.

PURPOSE: There is a lot of speculation about the role of nitric-oxide (NO) in the

improvement usually noticed in microcirculatory function, following exercise. The

knowledge of the underlying mechanisms leading to such an improvement is

important as it may help in targeting and implementing therapies for

microcirculatory diseases. Through a laser Doppler flowmetry (LDF) signal

processing study, the authors' goal is to compare multiFRACTAL spectra of LDF

data recorded in both lower leg and forearm, during different exercise

conditions, in an older, untrained but healthy population.

METHODS: Using the method suggested by Halsey et al. [Phys. Rev. A 33, 1141-1151

(1986)], multiFRACTAL spectra of LDF signals recorded on lower leg and forearm

before and after exercise (arm-cranking), before and after acetylcholine (ACh)

iontophoresis, were determined on scales in relation with the NO-dependent

endothelial activity. The width of each multiFRACTAL spectrum was then computed

through the maximum and minimum Hölder exponent values for which the

multiFRACTAL spectrum reaches its minimal values. The results were then compared.

RESULTS: Following exercise and on the scales studied, the average width of the

multiFRACTAL spectra in both lower leg and forearm does not vary significantly

before and after ACh iontophoresis. Similarly, following ACh iontophoresis and

exercise, the average width of multiFRACTAL spectra remains statistically

unchanged, when compared to that measured prior to exercise, in both upper and

lower body, although negative trends can be observed.

CONCLUSIONS: For the authors' population and for the type of exercise that the

authors have chosen, the authors showed that the width of the multiFRACTAL

spectra of LDF signals does not change significantly on scales in relation with

the NO-dependent endothelial activity. Future studies may involve comparisons

with signals obtained in patient populations.

PMID: 23387723 [PubMed - indexed for MEDLINE]
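[The "width of the multiFRACTAL spectrum" used as the outcome measure above comes from the standard multiFRACTAL formalism of Halsey et al.: mass exponents τ(q) are Legendre-transformed into the singularity spectrum f(α), and the width is the spread of Hölder exponents α. A sketch, using the analytic τ(q) of a binomial cascade with illustrative weight p in place of exponents estimated from LDF data:]

```python
import numpy as np

# Mass exponents tau(q) of a binomial multiplicative cascade with weight p.
# This analytic form stands in for exponents estimated from measured data.
p = 0.3
q = np.linspace(-10, 10, 401)
tau = -np.log2(p ** q + (1 - p) ** q)

# Legendre transform: alpha(q) = d tau/dq,  f(alpha) = q*alpha - tau(q)
alpha = np.gradient(tau, q)
f = q * alpha - tau

# Spectrum width = spread of Hölder exponents reached by the measure
width = alpha.max() - alpha.min()
```

The maximum of f(α) equals the dimension of the support (here 1), and a wider spectrum indicates stronger multiFRACTALity; for a monoFRACTAL measure the spectrum collapses toward a single point.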

42. Front Cell Neurosci. 2013 Jan 30;7:3. doi: 10.3389/fncel.2013.00003. eCollection 2013.


Quantitating the subtleties of microglial morphology with FRACTAL analysis.

Karperien A(1), Ahammer H, Jelinek HF.

Author information:

(1)Centre for Research in Complex Systems, School of Community Health, Charles Sturt

University Albury, NSW, Australia.

It is well established that microglial form and function are inextricably linked.

In recent years, the traditional view that microglial form ranges between

"ramified resting" and "activated amoeboid" has been emphasized through advancing

imaging techniques that point to microglial form being highly dynamic even within

the currently accepted morphological categories. Moreover, microglia adopt

meaningful intermediate forms between categories, with considerable crossover in

function and varying morphologies as they cycle, migrate, wave, phagocytose, and

extend and retract fine and gross processes. From a quantitative perspective, it

is problematic to measure such variability using traditional methods, but one way

of quantitating such detail is through FRACTAL analysis. The techniques of

FRACTAL analysis have been used for quantitating microglial morphology, to

categorize gross differences but also to differentiate subtle differences (e.g.,

amongst ramified cells). MultiFRACTAL analysis in particular is one technique of

FRACTAL analysis that may be useful for identifying intermediate forms. Here we

review current trends and methods of FRACTAL analysis, focusing on box counting

analysis, including lacunarity and multiFRACTAL analysis, as applied to

microglial morphology.

PMCID: PMC3558688

PMID: 23386810 [PubMed]
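[Box counting, the workhorse method named in the review above, is easy to state: cover the image with boxes of side s, count the boxes N(s) that contain any part of the object, and estimate the dimension as the slope of log N(s) versus log(1/s). A minimal sketch for a square binary image with power-of-two side; this is the generic algorithm only, not the authors' lacunarity or multiFRACTAL extensions:]

```python
import numpy as np

def box_counting_dimension(img):
    """Box-counting dimension of a square binary image (power-of-two side)."""
    n = img.shape[0]
    sizes, counts = [], []
    s = n // 2
    while s >= 1:
        # mark each s-by-s box that contains at least one foreground pixel
        boxes = img.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(boxes.sum())
        s //= 2
    # dimension = slope of log N(s) versus log(1/s)
    return np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)[0]

filled = np.ones((256, 256), dtype=bool)   # a filled square is 2-dimensional
d_filled = box_counting_dimension(filled)  # slope comes out at 2
```

For branching cell silhouettes such as microglia, the estimate falls strictly between 1 and 2, which is what makes it usable as a morphology score.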

43. IEEE Trans Neural Syst Rehabil Eng. 2013 Mar;21(2):225-32. doi:

10.1109/TNSRE.2012.2236576. Epub 2013 Jan 9.

Real-time mental arithmetic task recognition from EEG signals.

Wang Q(1), Sourina O.

Author information:

(1)School of Electrical and Electronic Engineering, and Institute for Media

Innovation, Nanyang Technological University, 639798, Singapore.

Electroencephalography (EEG)-based monitoring of the state of the user's brain

functioning, with visual/audio/tactile feedback given to the user, is called the

neurofeedback technique; it could allow the user to train the corresponding

brain functions. It could provide an alternative way of treatment for some

psychological disorders such as attention deficit hyperactivity disorder (ADHD),

where concentration function deficit exists, autism spectrum disorder (ASD), or

dyscalculia, where difficulty in learning and comprehending arithmetic

exists. In this paper, a novel method for multiFRACTAL analysis of EEG signals

named generalized Higuchi FRACTAL dimension spectrum (GHFDS) was proposed and

applied in mental arithmetic task recognition from EEG signals. Other features

such as power spectrum density (PSD), autoregressive model (AR), and statistical

features were analyzed as well. The usage of the proposed FRACTAL dimension

spectrum of EEG signal in combination with other features improved the mental

arithmetic task recognition accuracy in both multi-channel and one-channel

subject-dependent algorithms to up to 97.87% and 84.15%, respectively. Based on

the channel ranking, four channels were chosen, which gave an accuracy of up to

97.11%. A reliable real-time neurofeedback system could be implemented based on the

algorithms proposed in this paper.

PMID: 23314778 [PubMed - indexed for MEDLINE]
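[The Higuchi method generalized in the paper above estimates a FRACTAL dimension directly from a time series: a curve length L(k) is measured at coarser and coarser delays k, and the dimension is the slope of log L(k) versus log(1/k). A sketch of the classical (non-generalized) Higuchi algorithm; kmax and the test signals are illustrative choices:]

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Classical Higuchi FRACTAL dimension of a 1-D signal."""
    n = len(x)
    log_inv_k, log_len = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                       # one coarse curve per offset m
            xk = x[np.arange(m, n, k)]
            # normalized curve length at delay k
            lk = np.abs(np.diff(xk)).sum() * (n - 1) / ((len(xk) - 1) * k)
            lengths.append(lk / k)
        log_inv_k.append(np.log(1.0 / k))
        log_len.append(np.log(np.mean(lengths)))
    return np.polyfit(log_inv_k, log_len, 1)[0]  # slope = FRACTAL dimension

rng = np.random.default_rng(0)
fd_noise = higuchi_fd(rng.standard_normal(4096))  # white noise: close to 2
fd_line = higuchi_fd(np.arange(1024.0))           # smooth ramp: close to 1
```

The dimension ranges from 1 (smooth) toward 2 (maximally irregular), which is why it separates EEG under different mental loads.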

44. J Theor Biol. 2013 Mar 21;321:54-62. doi: 10.1016/j.jtbi.2012.12.027. Epub 2013

Jan 10.

Investigation on series of length of coding and non-coding DNA sequences of

bacteria using multiFRACTAL detrended cross-correlation analysis.

Stan C(1), Cristescu MT, Luiza BI, Cristescu CP.

Author information:

(1)Department of Physics, Faculty of Applied Sciences, Politehnica University of

Bucharest, Bucharest, Romania.

In the framework of multiFRACTAL detrended cross-correlation analysis, we

investigate characteristics of series of length of coding and non-coding DNA

sequences of some bacteria and archaea. We propose the use of a multiFRACTAL

cross-correlation series that can be defined for any pair of equal lengths data

sequences (or time series) and that can be characterized by the full set of

parameters that are attributed to any time series. Comparison between

characteristics of series of length of coding and non-coding DNA sequences and of

their associated multiFRACTAL cross-correlation series for selected groups is

used for the identification of class affiliation of certain bacteria and archaea.

The analysis is carried out using the dependence of the generalized Hurst

exponent on the size of fluctuations, the shape of the singularity spectra, the

shape and relative disposition of the curves of the singular measures scaling

exponent and the values of the associated parameters. Empirically, we demonstrate

that the series of lengths of coding and non-coding sequences as well as the

associated multiFRACTAL cross-correlation series can be approximated as universal multiFRACTALs.


Copyright © 2013 Elsevier Ltd. All rights reserved.

PMID: 23313335 [PubMed - indexed for MEDLINE]

45. Ann Biomed Eng. 2013 Aug;41(8):1635-45. doi: 10.1007/s10439-012-0724-z. Epub 2012

Dec 18.

Identifying multiplicative interactions between temporal scales of human movement variability.


Ihlen EA(1), Vereijken B.

Author information:

(1)Department of Neuroscience, Norwegian University of Science and Technology,

Trondheim, Norway.

Conventional scaling analyses of human movement variability such as detrended

fluctuation analyses assume that the movement variable can be decomposed into

scale-dependent variation. However, these conventional scaling analyses are

insensitive to multiplicative interactions within the movement variable.

Multiplicative interactions refer to couplings between the scale-dependent

variations across multiple scales that generate intermittent changes in the human

movement variable. The mathematical concept for intermittent variability

generated by multiplicative interactions is called multiFRACTAL variability.

MultiFRACTAL variability is numerically defined by a spectrum of scaling

exponents (i.e., a multiFRACTAL spectrum) that can be an important feature of

coordinated movements and, consequently, relevant when aiming to identify

movement disorders. In the current study, a new method is introduced based on

detrended fluctuation analysis that can identify the multiFRACTAL spectrum from

the temporal variation of local scaling exponents. The influence of

multiplicative interactions on the local scaling exponents is tested by a Monte

Carlo surrogate test. The methods are validated on multiplicative cascading

processes with known multiplicative interactions. The application of the new

methods is subsequently illustrated on an example of centre of pressure

variations during quiet and relaxed standing. The results show that

multiplicative interactions are present during periods with large movements of

the center of gravity, where the movements of the centre of gravity and centre of

pressure couple into coordinative structures. Further application and

interpretation of the developed method for the study of human movement

variability are discussed.

PMID: 23247986 [PubMed - indexed for MEDLINE]

46. Meat Sci. 2013 Mar;93(3):723-32. doi: 10.1016/j.meatsci.2012.11.015. Epub 2012

Nov 16.

MultiFRACTAL analysis application to the characterization of fatty infiltration

in Iberian and White pork sirloins.

Serrano S(1), Perán F, Jiménez-Hornero FJ, Gutiérrez de Ravé E.

Author information:

(1)Department of Food Hygiene and Technology, University of Cordoba, Campus

Rabanales, Edif. Darwin, anexo, Cordoba 14071, Spain.

This paper applies the multiFRACTAL analysis based on the sandbox method to

describe the distribution of fatty infiltration in Iberian and White pork meat

with the aim of characterization and classification. This work was carried out by

taking photographs of sirloin cuts of both breeds, which were then treated with image

analysis software. The obtained image data were stored in text format and

constituted the input for multiFRACTAL analysis. The results obtained show that

pork sirloin connective fatty tissue exhibits a multiFRACTAL type of scaling.

Significant correlations were found between some of the parameters governing the

multiFRACTAL behavior and fat percentage, especially in the case of Iberian

sirloin. The differences found for the relationships between the generalized

FRACTAL dimensions and fat percentage provide information for the categorization

of the studied meat pieces.

Copyright © 2012 Elsevier Ltd. All rights reserved.

PMID: 23247059 [PubMed - indexed for MEDLINE]

47. Front Physiol. 2012 Nov 15;3:417. doi: 10.3389/fphys.2012.00417. eCollection 2012.


Pitfalls in FRACTAL Time Series Analysis: fMRI BOLD as an Exemplary Case.

Eke A(1), Herman P, Sanganahalli BG, Hyder F, Mukli P, Nagy Z.

Author information:

(1)Institute of Human Physiology and Clinical Experimental Research, Semmelweis

University Budapest, Hungary; Diagnostic Radiology, Yale University, New Haven, CT, USA.


This article will be positioned on our previous work demonstrating the importance

of adhering to a carefully selected set of criteria when choosing the suitable

method from those available ensuring its adequate performance when applied to

real temporal signals, such as fMRI BOLD, to evaluate one important facet of

their behavior, FRACTALity. Earlier, we reviewed a range of monoFRACTAL

tools and evaluated their performance. Given the advance in the FRACTAL field, in

this article we will discuss the most widely used implementations of multiFRACTAL

analyses, too. Our recommended flowchart for the FRACTAL characterization of

spontaneous, low frequency fluctuations in fMRI BOLD will be used as the

framework for this article to make certain that it will provide a hands-on

experience for the reader in handling the perplexing issues of FRACTAL analysis.

The reason why this particular signal modality and its FRACTAL analysis has been

chosen was due to its high impact on today's neuroscience given it had powerfully

emerged as a new way of interpreting the complex functioning of the brain (see

"intrinsic activity"). The reader will first be presented with the basic concepts

of mono and multiFRACTAL time series analyses, followed by some of the most

relevant implementations, characterization by numerical approaches. The notion of

the dichotomy of fractional Gaussian noise and fractional Brownian motion signal

classes and their impact on FRACTAL time series analyses will be thoroughly

discussed as the central theme of our application strategy. Sources of pitfalls

and way how to avoid them will be identified followed by a demonstration on

FRACTAL studies of fMRI BOLD taken from the literature and that of our own in an

attempt to consolidate the best practice in FRACTAL analysis of empirical fMRI

BOLD signals mapped throughout the brain as an exemplary case of potentially wide use.


PMCID: PMC3513686

PMID: 23227008 [PubMed]

48. Med Eng Phys. 2013 Aug;35(8):1070-8. doi: 10.1016/j.medengphy.2012.11.004. Epub

2012 Nov 30.

Complexity of the autonomic heart rate control in coronary artery occlusion in

patients with and without prior myocardial infarction.

Magrans R(1), Gomis P, Caminal P, Wagner GS.

Author information:

(1)Departament d'Enginyeria de Sistemas, Automàtica i Informàtica Industrial,

Universitat Politècnica de Catalunya, Barcelona, Spain.

Autonomic nervous system (ANS) is governed by complex interactions arising from

feedback loops of nonlinear systems that operate over a wide range of temporal

and spatial scales, enabling the organism to adapt to stress, metabolic changes

and diseases. This study is aimed to assess multiFRACTAL and nonlinear

characteristics of the ANS during ischemic events provoked by a prolonged

percutaneous coronary intervention (PCI) procedure. Eighty-seven patients from

the STAFF III database were used. Patients were classified into 2 groups: (1)

with prior myocardial infarction (MI) and (2) without MI (noMI). R-R signals

during three 3-min stages of the procedures were analyzed using multiFRACTAL and

surrogate data techniques. MultiFRACTAL indices increased significantly from the

pre-inflation stage to the post-deflation stage. These variations were more

marked for the noMI group. MultiFRACTAL changes significantly correlated with

both the decreased parasympathetic and the increased sympathetic modulations

accounted for by classical linear indices. MultiFRACTAL measures proved to be a

more powerful indicator than linear HRV indices in quantifying the

ischemia-induced changes. Right coronary artery (RCA) occlusions provoke greater

multiFRACTAL reactions throughout the PCI procedure. Our findings suggest reduced

complex multiFRACTAL and nonlinear reactions of ANS activity in patients with

prior MI in comparison to the noMI group, possibly due to degradation in the

complexity of control mechanism of heart rate generation.

Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.

PMID: 23201277 [PubMed - indexed for MEDLINE]

49. Comput Med Imaging Graph. 2013 Jan;37(1):61-71. doi:

10.1016/j.compmedimag.2012.10.001. Epub 2012 Nov 9.

Computational grading of hepatocellular carcinoma using multiFRACTAL feature description.


Atupelage C(1), Nagahashi H, Yamaguchi M, Abe T, Hashiguchi A, Sakamoto M.

Author information:

(1)Department of Computational Intelligence and Systems Science, Tokyo Institute of

Technology, Japan.

Cancer grading has become an important topic in the field of image

interpretation-based computer aided diagnosis systems. This paper proposes a

novel feature descriptor to observe the characteristics of histopathological

textures in a discriminative manner. The proposed feature descriptor utilizes

FRACTAL geometric analysis with four multiFRACTAL measures to construct an eight

dimensional feature space. The proposed method employed a bag-of-feature-based

classification model to discriminate a set of hepatocellular carcinoma images

into five categories according to Edmondson and Steiner's grading system. Three

feature selection methods were utilized to obtain the most discriminative

features of codeword dictionary (codebook). Furthermore, we incorporated four

other textural feature descriptors: Gabor-filters, LM-filters, local binary

patterns, and Haralick, to obtain a benchmark of the accuracy of the

classification. Two experiments were performed: (i) classifying non-neoplastic

tissues and tumors and (ii) grading the hepatocellular carcinoma images into five

classes. Experimental results indicated the significance of the multiFRACTAL

features for describing the histopathological image texture, because they

outperformed the other four feature descriptors. We graded a given ROI image by

defining a threshold-based majority-voting rule and obtained an average correct

classification rate of around 95% for the five-class task.

Copyright © 2012 Elsevier Ltd. All rights reserved.

PMID: 23141965 [PubMed - indexed for MEDLINE]

50. Med Biol Eng Comput. 2013 Mar;51(3):277-84. doi: 10.1007/s11517-012-0990-9. Epub

2012 Nov 7.

Classification of surface electromyographic signals by means of multiFRACTAL

singularity spectrum.

Wang G(1), Ren D.

Author information:

(1)Key Laboratory of Biomedical Information Engineering of Ministry of Education,

Institute of Biomedical Engineering, School of Life Science and Technology, Xi'an

Jiaotong University, 28 Xianning West Road, Xi'an, 710049, China.

In order to effectively control a prosthetic system, considerable attempts have

been made in recent years to improve the classification accuracy of surface

electromyographic (SEMG) signals. However, the extraction of effective features

is still a primary challenge for the classification of SEMG signals. This study

tried to solve the problem by applying the multiFRACTAL analysis. It was found

that the SEMG signals were characterized by multiFRACTALity during forearm

movements and different types of forearm movements were related to different

multiFRACTAL singularity spectra. To quantitatively evaluate the multiFRACTAL

singularity spectra of the SEMG signals, the areas of the singularity spectrum

curves were calculated by integrating the spectrum curves with respect to the

singularity strengths. Our results showed that there were several separate

clusters resulting from singularity spectrum areas of different forearm movements

when two channels of SEMG signals were used in this experimental research, which

demonstrated that the multiFRACTAL analysis approach was suitable for identifying

different types of forearm movements. By comparing with other feature extraction

techniques, the multiFRACTAL singularity spectrum approach provided higher

classification accuracy in terms of the classification of SEMG signals.

PMID: 23132526 [PubMed - indexed for MEDLINE]

51. Med Phys. 2012 Oct;39(10):5849-56. doi: 10.1118/1.4748506.

Laser speckle contrast imaging: multiFRACTAL analysis of data recorded in healthy subjects.


Humeau-Heurtier A(1), Mahe G, Durand S, Henrion D, Abraham P.

Author information:

(1)Université d'Angers, Laboratoire d'Ingénierie des Systèmes Automatisés,

Angers, France.

PURPOSE: The monitoring of microvascular blood flow can now be performed with

laser speckle contrast imaging (LSCI), a new noninvasive laser-based technique.

LSCI images have good spatial and temporal resolutions. Nevertheless, until now,

little processing of these data has been performed to gain better knowledge of

their properties. We herein propose a multiFRACTAL analysis of LSCI data recorded

in the forearm of healthy subjects, based on the method from Halsey et al., one

of the popular methods using the box-counting technique.

METHODS: In laser speckle contrast image time sequences, we studied time

evolution of pixel values, as well as time evolution of pixel values averaged in

regions of interest (ROI) of different sizes. The results are compared with the

ones obtained with single-point laser Doppler flowmetry (LDF) signals recorded

simultaneously to LSCI images.

RESULTS: Our work shows that, for the range of scales studied and with the method

from Halsey et al., the time evolution of pixel values presents narrow multiFRACTAL

spectra, resembling those of monoFRACTAL data. However, we observe that when

LSCI pixel values are averaged in ROI large enough and followed with time, the

multiFRACTAL spectra become larger and closer to the ones of LDF signals.

CONCLUSIONS: Single pixels from laser speckle contrast images may not possess the

same multiFRACTAL properties as LDF signals. These findings could now be compared

with the ones obtained with other ranges of scales and with data recorded from

pathological subjects.

PMID: 23039623 [PubMed - indexed for MEDLINE]

52. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Jul;86(1 Pt 1):011924. Epub 2012

Jul 24.

MultiFRACTAL dynamics of turbulent flows in swimming bacterial suspensions.

Liu KA(1), I L.

Author information:

(1)Department of Physics and Center for Complex Systems, National Central

University, Jhongli, Taiwan 32001, Republic of China.

Erratum in

Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Oct;86(4 Pt 2):049905.

We experimentally investigate the self-propelled two-dimensional turbulent flows

of Escherichia coli suspensions in thin liquid films at two different cell

concentrations. It is found that the flow has fluctuating vortices with a broad

range of scales and intensities through the nonlinear interaction of the swimming

bacteria. Increasing cell concentration increases the total propelling power and

the nonlinear interaction. It causes the generation of vortices with larger

scale, lower frequency, and higher intensity. It also widens the histograms of

the flow velocity and the velocity increment between two spatially separated

points with more stretched non-Gaussian tails. From the scaling analysis of the

structure function S(q)(r) of the qth moment of the velocity increment between

two points with spatial separation r, nonlinear relations between the scaling

exponent ζ(q) of S(q)(r) and q are found for both cell concentrations, which

manifests the multiFRACTAL dynamics. The multiFRACTALity can be enhanced by

increasing cell concentration.

PMID: 23005469 [PubMed - indexed for MEDLINE]
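[The structure-function analysis used above is the classical turbulence diagnostic: S(q)(r), the qth moment of velocity increments over separation r, scales as r^ζ(q), and curvature of ζ(q) in q signals multiFRACTAL dynamics. A sketch on a synthetic monoFRACTAL series (an ordinary random walk, for which ζ(q) = q/2); the moments and separations are illustrative choices, not the authors' experimental pipeline:]

```python
import numpy as np

def structure_exponents(u, qs, rs):
    """Scaling exponents zeta(q) of structure functions S_q(r) = <|u(x+r)-u(x)|^q>."""
    log_r = np.log(rs)
    zetas = []
    for q in qs:
        log_s = [np.log(np.mean(np.abs(u[r:] - u[:-r]) ** q)) for r in rs]
        zetas.append(np.polyfit(log_r, log_s, 1)[0])  # slope of log S_q vs log r
    return np.array(zetas)

# Random walk: monoFRACTAL with H = 1/2, so zeta(q) = q/2 (linear in q).
# MultiFRACTAL flows bend zeta(q) below this line at large q.
rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(1 << 16))
zeta = structure_exponents(walk, qs=[1, 2, 3], rs=[1, 2, 4, 8, 16, 32])
```

A nonlinear ζ(q)-versus-q curve, as reported for the bacterial suspensions, is precisely the departure from this linear baseline.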

53. J Neural Eng. 2012 Oct;9(5):056008. Epub 2012 Aug 28.

Complexity and multiFRACTALity of neuronal noise in mouse and human hippocampal

epileptiform dynamics.

Serletis D(1), Bardakjian BL, Valiante TA, Carlen PL.

Author information:

(1)Neurological Institute, Epilepsy Center, Cleveland Clinic, OH 44195, USA.

FRACTAL methods offer an invaluable means of investigating turbulent nonlinearity

in non-stationary biomedical recordings from the brain. Here, we investigate

properties of complexity (i.e. the correlation dimension, maximum Lyapunov

exponent, 1/f(γ) noise and approximate entropy) and multiFRACTALity in background

neuronal noise-like activity underlying epileptiform transitions recorded at the

intracellular and local network scales from two in vitro models: the whole-intact

mouse hippocampus and lesional human hippocampal slices. Our results show

evidence for reduced dynamical complexity and multiFRACTAL signal features

following transition to the ictal epileptiform state. These findings suggest that

pathological breakdown in multiFRACTAL complexity coincides with loss of signal

variability or heterogeneity, consistent with an unhealthy ictal state that is

far from the equilibrium of turbulent yet healthy FRACTAL dynamics in the brain.

Thus, it appears that background noise-like activity successfully captures

complex and multiFRACTAL signal features that may, at least in part, be used to

classify and identify brain state transitions in the healthy and epileptic brain,

offering potential promise for therapeutic neuromodulatory strategies for

afflicted patients suffering from epilepsy and other related neurological disorders.


PMID: 22929878 [PubMed - indexed for MEDLINE]

54. Anesthesiology. 2012 Oct;117(4):810-21.

MultiFRACTAL analysis of hemodynamic behavior: intraoperative instability and its

pharmacological manipulation.

Bishop SM(1), Yarham SI, Navapurkar VU, Menon DK, Ercole A.

Author information:

(1)Division of Anaesthesia, School of Clinical Medicine, University of Cambridge,

Cambridge, United Kingdom.

Comment in

Anesthesiology. 2012 Oct;117(4):699-700.

BACKGROUND: Physiologic instability is a common clinical problem in the

critically ill. Many natural feedback systems are nonlinear, and seemingly random

fluctuations may result from the amplification of external perturbations or even

arise de novo as a consequence of their underlying dynamics. Characterization of

the underlying nonlinear state may be of clinical importance, providing a

technique to monitor complex physiology in real-time, guiding patient care and

improving outcomes.

METHODS: We employ the wavelet modulus maxima technique to characterize the

multiFRACTAL properties of heart rate and mean arterial pressure physiology

retrospectively for four patients during open abdominal aortic aneurysm repair.

We calculated point-estimates for the dominant Hölder exponent (h(m)(HR), h(m)(MAP)) and

multiFRACTAL spectrum width-at-half-height for both heart rate and mean arterial

pressure signals. We investigated how these parameters changed with the

administration of an intravenous vasoconstrictor and examined how this varied

with atropine pretreatment.

RESULTS: Hypotensive patients showed lower values of h(m)(MAP), consistent with a more

highly fluctuating and complex behavior. Treatment with a vasoconstrictor led to

a transient increase in h(m)(MAP), revealing the appearance of longer-range

correlations, but did not impact h(m)(HR). On the other hand, prior treatment with

atropine had no effect on h(m)(MAP) behavior but did tend to increase h(m)(HR).

CONCLUSIONS: Hypotension leads to a reduction in dominant Hölder exponents for

mean arterial pressure, demonstrating an increasing signal complexity consistent

with the activation of important homeokinetic processes. Conversely,

pharmacological interventions may also alter the underlying dynamics.

Pharmacological restoration of homeostasis leads to system decomplexification,

suggesting that homeokinetic mechanisms are derecruited as homeostasis is restored.


PMID: 22913922 [PubMed - indexed for MEDLINE]

55. IEEE Trans Image Process. 2013 Jan;22(1):286-99. doi: 10.1109/TIP.2012.2214040.

Epub 2012 Aug 17.

Wavelet domain multiFRACTAL analysis for static and dynamic texture classification.


Ji H(1), Yang X, Ling H, Xu Y.

Author information:

(1)Department of Mathematics, National University of Singapore, Singapore.

In this paper, we propose a new texture descriptor for both static and dynamic

textures. The new descriptor is built on the wavelet-based spatial-frequency

analysis of two complementary wavelet pyramids: standard multiscale and wavelet

leader. These wavelet pyramids essentially capture the local texture responses in

multiple high-pass channels in a multiscale and multiorientation fashion, in

which there exists a strong power-law relationship for natural images. Such a

power-law relationship is characterized by the so-called multiFRACTAL analysis.

In addition, two more techniques, scale normalization and multiorientation image

averaging, are introduced to further improve the robustness of the proposed

descriptor. Combining these techniques, the proposed descriptor enjoys both high

discriminative power and robustness against many environmental changes. We apply

the descriptor for classifying both static and dynamic textures. Our method has

demonstrated excellent performance in comparison with the state-of-the-art

approaches in several public benchmark datasets.

PMID: 22910109 [PubMed - indexed for MEDLINE]

56. Izv Akad Nauk Ser Biol. 2012 May-Jun;(3):327-35.

[MultiFRACTAL analysis of the species structure of freshwater hydrobiocenoses].

[Article in Russian]

Gelashvili DB, Iudin DI, Iakimov VN, Solntsev LA, Rozenberg GS, Shurganova GV,

Okhapkin AG, Startseva NA, Pukhnarevich DA, Snegireva MS.

The principles and methods of FRACTAL analysis of the species structure of

freshwater phytoplankton, zooplankton, and macrozoobenthos communities of plain

water reservoirs and urban waterbodies are discussed. The theoretical foundation

and experimental verification are provided for the authors' concept of

self-similar (quasi-FRACTAL) nature of the species structure of communities.

According to this concept, the adequate mathematical image of species richness

accumulation with growing sampling effort is a quasi-monoFRACTAL, while the

generalized geometric image of the species structure of the community is a

multiFRACTAL spectrum.

PMID: 22834317 [PubMed - indexed for MEDLINE]

57. Fiziol Cheloveka. 2012 May-Jun;38(3):30-6.

[FRACTAL characteristics of the functional state of the brain in patients with

anxious phobic disorders].

[Article in Russian]

Dik OE, Sviatogor IA, Ishinova VA, Nozdrachev AD.

The task of estimation of the functional state of the human brain during

psychotherapeutic treatment of psychogenic pain in patients with anxious phobic

disorders is examined. For solving the task the methods of spectral and

multiFRACTAL analyses of EEG fragments are applied during the perception of

psychogenic pain and its removal by the psychorelaxation technique. Contrary to

power spectra, singularity spectra allow EEGs to be distinguished quantitatively in the

examined functional states of the human brain. The pain suppression in patients

with anxious phobic disorders during psychorelaxation is accompanied by changing

the width of the singularity spectrum and approximation of this multiFRACTAL

parameter to the value corresponding to a healthy subject.

PMID: 22830241 [PubMed - indexed for MEDLINE]

58. PLoS One. 2012;7(7):e41148. doi: 10.1371/journal.pone.0041148. Epub 2012 Jul 18.

FRACTAL dimension and vessel complexity in patients with cerebral arteriovenous malformations.


Reishofer G(1), Koschutnig K, Enzinger C, Ebner F, Ahammer H.

Author information:

(1)Department of Radiology, MR-Physics, Medical University of Graz, Graz, Austria.

The FRACTAL dimension (FD) can be used as a measure for morphological complexity

in biological systems. The aim of this study was to test the usefulness of this

quantitative parameter in the context of cerebral vascular complexity. FRACTAL

analysis was applied on ten patients with cerebral arteriovenous malformations

(AVM) and ten healthy controls. Maximum intensity projections from Time-of-Flight

MRI scans were analyzed using different measurements of FD, the Box-counting

dimension, the Minkowski dimension and generalized dimensions evaluated by means

of multiFRACTAL analysis. The physiological significance of this parameter was

investigated by comparing values of FD first, with the maximum slope of contrast

media transit obtained from dynamic contrast-enhanced MRI data and second, with

the nidus size obtained from X-ray angiography data. We found that for all

methods, the Box-counting dimension, the Minkowski dimension and the generalized

dimensions, FD was significantly higher in the hemisphere with AVM compared to the

hemisphere without AVM indicating that FD is a sensitive parameter to capture

vascular complexity. Furthermore, we found a high correlation between FD and the

maximum slope of contrast media transit and between FD and the size of the

central nidus pointing out the physiological relevance of FD. The proposed method

may therefore serve as an additional objective parameter, which can be assessed

automatically and might assist in the complex workup of AVMs.

PMCID: PMC3399805

PMID: 22815946 [PubMed - indexed for MEDLINE]
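The Box-counting dimension used in the study above has a standard, compact implementation. The sketch below is a minimal illustration of the general technique on a binary image, not the authors' pipeline; the function name, box sizes, and the NumPy dependency are my own choices:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Box-counting estimate of the FRACTAL dimension of a 2-D binary mask."""
    n = mask.shape[0]
    counts = []
    for s in sizes:
        m = mask[:n - n % s, :n - n % s]      # trim so s x s boxes tile exactly
        boxes = m.reshape(n // s, s, -1, s)   # (row-box, row, col-box, col)
        counts.append(boxes.any(axis=(1, 3)).sum())  # occupied boxes N(s)
    # dimension = slope of log N(s) versus log(1/s)
    return np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)[0]
```

On a straight line the estimate comes out near 1, and on a filled region near 2; vascular trees such as the AVM images fall in between.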

59. PLoS One. 2012;7(7):e40693. doi: 10.1371/journal.pone.0040693. Epub 2012 Jul 17.

Evidence of multiFRACTALity from emerging European stock markets.

Caraiani P.

Author information:

Institute for Economic Forecasting, Romanian Academy, Bucharest, Romania.

We test for the presence of multiFRACTALity in the daily returns of the three

most important stock market indices from Central and Eastern Europe, Czech PX,

Hungarian BUX and Polish WIG using the Empirical Mode Decomposition based

MultiFRACTAL Detrended Fluctuation Analysis. We found that the global Hurst

coefficient varies with the q coefficient and that there is multiFRACTALity

evidenced through the multiFRACTAL spectrum. The exercise is replicated for the

sample around the high volatility period corresponding to the last global

financial crisis. Although no direct link has been found between the crisis and

the multiFRACTAL spectrum, the crisis was found to influence the overall shape as

quantified through the norm of the multiFRACTAL spectrum.

PMCID: PMC3398935

PMID: 22815792 [PubMed - indexed for MEDLINE]

60. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Apr;85(4 Pt 2):046208. Epub 2012

Apr 12.

MultiFRACTAL dimensions for all moments for certain critical random-matrix

ensembles in the strong multiFRACTALity regime.

Bogomolny E(1), Giraud O.

Author information:

(1)Université Paris-Sud, CNRS, LPTMS, UMR8626, Orsay, France.

We construct perturbation series for the qth moment of eigenfunctions of various

critical random-matrix ensembles in the strong multiFRACTALity regime close to

localization. Contrary to previous investigations, our results are valid in the

region q<1/2. Our findings allow one to verify, at first leading orders in the

strong multiFRACTALity limit, the symmetry relation for anomalous FRACTAL

dimensions Δ(q)=Δ(1-q), recently conjectured for critical models where an analog

of the metal-insulator transition takes place. It is known that this relation is

verified at leading order in the weak multiFRACTALity regime. Our results thus

indicate that this symmetry holds in both limits of small and large coupling

constant. For general values of the coupling constant we present careful

numerical verifications of this symmetry relation for different critical

random-matrix ensembles. We also present an example of a system closely related

to one of these critical ensembles, but where the symmetry relation, at least

numerically, is not fulfilled.

PMID: 22680557 [PubMed]

61. Front Physiol. 2012 Jun 4;3:141. doi: 10.3389/fphys.2012.00141. eCollection 2012.

Introduction to multiFRACTAL detrended fluctuation analysis in matlab.

Ihlen EA.

Author information:

Department of Neuroscience, Norwegian University of Science and Technology

Trondheim, Norway.

FRACTAL structures are found in biomedical time series from a wide range of

physiological phenomena. The multiFRACTAL spectrum identifies the deviations in

FRACTAL structure within time periods with large and small fluctuations. The

present tutorial is an introduction to multiFRACTAL detrended fluctuation

analysis (MFDFA) that estimates the multiFRACTAL spectrum of biomedical time

series. The tutorial presents MFDFA step-by-step in an interactive Matlab

session. All Matlab tools needed are available in Introduction to MFDFA folder at

the website. MFDFA is introduced in Matlab code

boxes where the reader can apply pieces of, or the entire, MFDFA to example time

series. After introducing MFDFA, the tutorial discusses the best practice of

MFDFA in biomedical signal processing. The main aim of the tutorial is to give

the reader a simple self-sustained guide to the implementation of MFDFA and

interpretation of the resulting multiFRACTAL spectra.

PMCID: PMC3366552

PMID: 22675302 [PubMed]
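The MFDFA procedure that Ihlen's tutorial walks through in Matlab (integrate the series, detrend in windows, take q-th order averages of the window variances, read h(q) off a log-log fit) can be sketched compactly. This is a minimal Python paraphrase under my own naming and parameter choices, not the tutorial's code:

```python
import numpy as np

def mfdfa(x, scales=(16, 32, 64, 128), qs=(-3, -1, 2, 3), order=1):
    """Minimal MFDFA: generalized Hurst exponents h(q) of a 1-D series."""
    profile = np.cumsum(x - np.mean(x))          # step 1: integrated profile
    hq = {}
    for q in qs:
        log_fq = []
        for s in scales:
            n_seg = len(profile) // s
            var = np.empty(n_seg)
            t = np.arange(s)
            for i in range(n_seg):               # step 2: detrend each window
                seg = profile[i * s:(i + 1) * s]
                trend = np.polyval(np.polyfit(t, seg, order), t)
                var[i] = np.mean((seg - trend) ** 2)
            if q == 0:                           # step 3: q-th order average
                fq = np.exp(0.5 * np.mean(np.log(var)))
            else:
                fq = np.mean(var ** (q / 2.0)) ** (1.0 / q)
            log_fq.append(np.log(fq))
        # step 4: h(q) is the log-log slope of F_q(s) against s
        hq[q] = np.polyfit(np.log(scales), log_fq, 1)[0]
    return hq
```

For white noise all h(q) cluster near 0.5; a spread of h(q) across q is the multiFRACTAL signature the tutorial converts into a singularity spectrum.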

62. Microcirculation. 2012 Oct;19(7):652-63. doi: 10.1111/j.1549-8719.2012.00200.x.

Skin graft vascular maturation and remodeling: a multiFRACTAL approach to

morphological quantification.

Gould DJ(1), Reece GP.

Author information:

(1)Medical Scientist Training Program, Baylor College of Medicine, Houston, Texas

77030, USA.

OBJECTIVE: One important contributor to tissue graft viability is angiogenic

maturation of the graft tissue bed. This study uses scale-invariant microvascular

morphological quantification to track vessel maturation and remodeling in a

split-thickness skin-grafting model over 21 days, comparing the results to

classical techniques.

METHODS: Images from a previous study of split-thickness skin grafting in rats

were analyzed. Microvascular morphology (FRACTAL and multiFRACTAL dimensions,

lacunarity, and vessel density) within fibrin interfaces of samples over time was

quantified using classical semi-automated methods and automated multiFRACTAL and

lacunarity analyses.

RESULTS: Microvessel morphology increased in density and complexity from three

to seven days after engraftment and then regressed by 21 days. Vessel density

increased from 0.07 on day 3 to 0.20 on day 7 and then decreased to 0.06 on day

21. A similar trend was seen for the FRACTAL dimension that increased from 1.56

at three days to 1.77 at seven days then decreased to 1.57 by 21 days. Vessel

diameters did not change whereas complexity and density did, signaling vessel remodeling.


CONCLUSIONS: This new automated analysis identified design parameters for tissue

engraftment and could be used in other models of graft vessel biology to track

proliferation and pruning of complex vessel beds.

© 2012 John Wiley & Sons Ltd.

PMCID: PMC3467318

PMID: 22672367 [PubMed - indexed for MEDLINE]

63. Anal Quant Cytol Histol. 2012 Apr;34(2):105-8.

MultiFRACTAL spectrum differentiation of well-differentiated adenocarcinoma from

complex atypical hyperplasia of the uterus.

Barwad A(1), Dey P.

Author information:

(1)Department of Cytopathology and Gynecological Pathology, Postgraduate Institute

of Medical Education and Research, Chandigarh, India.

OBJECTIVE: To introduce a new field of multiFRACTAL spectrum in distinguishing

between endometrial well-differentiated adenocarcinoma (WDAC) and complex

atypical hyperplasia (CAH).

STUDY DESIGN: Thirteen cases of CAH and 16 of WDAC were selected from radical

hysterectomy specimens, and multiFRACTAL spectrum was measured from at least 4-5

representative digitized images of each case. The data were collected from

f(alpha) vs. alpha curves. The values of alpha max, alpha min, and their

difference delta alpha (alpha max-alpha min) were recorded and the data compared.

RESULTS: The mean +/- SD of alpha max, alpha min, and delta alpha of CAH were

2.36357 +/- 0.111623, 1.71357 +/- 0.032160, and 0.64214 +/- 0.094248,

respectively. The mean +/- SD of alpha max, alpha min, and delta of WDAC were

2.50640 +/- 0.104545, 1.72100 +/- 0.036436, and 0.77620 +/- 0.108268,

respectively. The mean of alpha max, alpha min, and delta alpha of WDAC were

higher than in CAH. Mann-Whitney U test showed significant difference (p <

0.0001) of alpha max and delta alpha of WDAC and CAH.

CONCLUSION: MultiFRACTAL dimension is significantly different in WDAC and CAH.

The multiFRACTAL dimension is a new area in pathology. This study demonstrates

the potential usefulness of multiFRACTAL analysis in histopathology.

PMID: 22611766 [PubMed - indexed for MEDLINE]
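The spectrum width delta alpha (alpha max minus alpha min) that Barwad and Dey measure from f(alpha) curves can be illustrated on a measure whose spectrum is known in closed form. The sketch below uses the analytically tractable binomial cascade with weights p and 1-p, purely to show where the width comes from; it is not the authors' image-analysis code, and the function name is mine:

```python
import numpy as np

def singularity_width(p, qs=np.linspace(-10, 10, 201)):
    """delta alpha = alpha max - alpha min for a binomial multiFRACTAL measure.

    tau(q) = -log2(p**q + (1-p)**q) is the mass exponent; alpha(q) is its
    derivative, and the spread of alpha(q) over q is the spectrum width."""
    tau = -np.log2(p ** qs + (1 - p) ** qs)
    alpha = np.gradient(tau, qs)  # alpha(q) = d tau / d q
    return alpha.max() - alpha.min()
```

For this model the exact width is log2((1-p)/p) for p < 1/2, so the numerical estimate can be checked directly; wider spectra mean more heterogeneous local scaling, which is what separated WDAC from CAH above.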

64. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Mar;85(3 Pt 1):031142. Epub 2012

Mar 28.

Markov-switching multiFRACTAL models as another class of random-energy-like

models in one-dimensional space.

Saakian DB.

Author information:

Institute of Physics, Academia Sinica, Nankang, Taipei 11529, Taiwan.

We map the Markov-switching multiFRACTAL model (MSM) onto the random energy model

(REM). The MSM is, like the REM, an exactly solvable model in one-dimensional

space with nontrivial correlation functions. According to our results, four

different statistical physics phases are possible in random walks with

multiFRACTAL behavior. We also introduce the continuous branching version of the

model, calculate the moments, and prove multiscaling behavior. Different phases

have different multiscaling properties.

PMID: 22587073 [PubMed - indexed for MEDLINE]

65. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Mar;85(3 Pt 1):031113. Epub 2012

Mar 13.

Geometrical exponents of contour loops on synthetic multiFRACTAL rough surfaces:

multiplicative hierarchical cascade p model.

Hosseinabadi S(1), Rajabpour MA, Movahed MS, Allaei SM.

Author information:

(1)Department of Physics, Alzahra University, Tehran, Iran.

In this paper, we study many geometrical properties of contour loops to

characterize the morphology of synthetic multiFRACTAL rough surfaces, which are

generated by multiplicative hierarchical cascading processes. To this end, two

different classes of multiFRACTAL rough surfaces are numerically simulated. As

the first group, singular measure multiFRACTAL rough surfaces are generated by

using the p model. The smoothened multiFRACTAL rough surface is then simulated by

convolving the first group with a so-called Hurst exponent, H*. The generalized

multiFRACTAL dimension of isoheight lines (contours), D(q), correlation exponent

of contours, x(l), cumulative distributions of areas, ξ, and perimeters, η, are

calculated for both synthetic multiFRACTAL rough surfaces. Our results show that

for both mentioned classes, hyperscaling relations for contour loops are the same

as that of monoFRACTAL systems. In contrast to singular measure multiFRACTAL

rough surfaces, H* plays a leading role in smoothened multiFRACTAL rough

surfaces. All computed geometrical exponents for the first class depend not only

on its Hurst exponent but also on the set of p values. But in spite of

multiFRACTAL nature of smoothened surfaces (second class), the corresponding

geometrical exponents are controlled by H*, the same as what happens for

monoFRACTAL rough surfaces.

PMID: 22587044 [PubMed - indexed for MEDLINE]

66. Phys Rev Lett. 2012 Mar 30;108(13):134502. Epub 2012 Mar 28.

Distribution of particles and bubbles in turbulence at a small Stokes number.

Fouxon I.

Author information:

Raymond and Beverly Sackler School of Physics and Astronomy, Tel-Aviv University,

Tel-Aviv 69978, Israel.

The inertia of particles driven by the turbulent flow of the surrounding fluid

makes them prefer certain regions of the flow. The heavy particles lag behind the

flow and tend to accumulate in the regions with less vorticity, while the light

particles do the opposite. As a result of the long-time evolution, the particles

distribute over a multiFRACTAL attractor in space. We consider this distribution

using our recent results on the steady states of chaotic dynamics. We describe

the preferential concentration analytically and derive the correlation functions

of density and the FRACTAL dimensions of the attractor. The results are obtained

for real turbulence and are testable experimentally.

PMID: 22540704 [PubMed]

67. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Feb;85(2 Pt 1):021915. Epub 2012

Feb 17.

Multiscale multiFRACTAL analysis of heart rate variability recordings with a

large number of occurrences of arrhythmia.

Gierałtowski J(1), Żebrowski JJ, Baranowski R.

Author information:

(1)Faculty of Physics, Warsaw University of Technology, Warsaw, Poland.

Human heart rate variability, in the form of time series of intervals between

heart beats, shows complex, FRACTAL properties. Recently, it was demonstrated

many times that the FRACTAL properties vary from point to point along the series,

leading to multiFRACTALity. In this paper, we concentrate not only on the fact

that the human heart rate has multiFRACTAL properties but also that these

properties depend on the time scale in which the multiFRACTALity is measured.

This time scale is related to the frequency band of the signal. We find that

human heart rate variability appears to be far more complex than hitherto

reported in the studies using a fixed time scale. We introduce a method called

multiscale multiFRACTAL analysis (MMA), which allows us to extend the description

of heart rate variability to include the dependence on the magnitude of the

variability and time scale (or frequency band). MMA is relatively immune to

additive noise and nonstationarity, including the nonstationarity due to

inclusions into the time series of events of a different dynamics (e.g.,

arrhythmic events in sinus rhythm). The MMA method may provide new ways of

measuring the nonlinearity of a signal, and it may help to develop new methods of

medical diagnostics.

© 2012 American Physical Society

PMID: 22463252 [PubMed - indexed for MEDLINE]

68. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Feb;85(2 Pt 1):021407. Epub 2012

Feb 27.

MultiFRACTAL analysis of the branch structure of diffusion-limited aggregates.

Hanan WG(1), Heffernan DM.

Author information:

(1)Department of Mathematical Physics, National University of Ireland Maynooth,

County Kildare, Ireland.

We examine the branch structure of radial diffusion-limited aggregation (DLA)

clusters for evidence of multiFRACTALity. The lacunarity of DLA clusters is

measured and the generalized dimensions D(q) of their mass distribution is

estimated using the sandbox method. We find that the global n-fold symmetry of

the aggregates can induce anomalous scaling behavior into these measurements.

However, negating the effects of this symmetry, standard scaling is recovered.

© 2012 American Physical Society

PMID: 22463212 [PubMed - indexed for MEDLINE]
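The sandbox method used above estimates the generalized dimensions D(q) from how the mass M(r) inside balls around randomly chosen occupied sites scales with radius, via the moment relation <M**(q-1)> ~ r**((q-1)*D(q)). A minimal sketch follows; the names and the uniform-lattice demo are illustrative stand-ins (a real DLA cluster would supply the point set), not the authors' implementation:

```python
import numpy as np

def sandbox_dq(points, centers, radii, qs):
    """Sandbox estimate of generalized dimensions D(q) for a 2-D point set."""
    # squared distances from every sandbox center to every point of the set
    d2 = ((centers[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    dq = {}
    for q in qs:
        log_m = []
        for r in radii:
            mass = (d2 <= r * r).sum(axis=1).astype(float)  # M(r) per center
            if q == 1:
                log_m.append(np.mean(np.log(mass)))
            else:
                log_m.append(np.log(np.mean(mass ** (q - 1))) / (q - 1))
        dq[q] = np.polyfit(np.log(radii), log_m, 1)[0]
    return dq
```

Centers should stay far enough from the set's boundary that the largest sandbox radius fits inside it; on a uniform planar set every D(q) comes out near 2, while a multiFRACTAL set makes D(q) decrease with q.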

69. Comput Methods Programs Biomed. 2012 Oct;108(1):176-85. doi:

10.1016/j.cmpb.2012.02.014. Epub 2012 Mar 27.

MultiFRACTAL characterisation of electrocardiographic RR and QT time-series

before and after progressive exercise.

Lewis MJ(1), Short AL, Suckling J.

Author information:

(1)College of Engineering, Swansea University, Swansea, UK.

The scaling (FRACTAL) characteristics of electrocardiograms (ECG) provide

information complementary to traditional linear measurements (heart rate,

repolarisation rate etc.) allowing them to discriminate signal changes induced

pathologically or pharmacologically. Under such interventions scaling behaviour

is described by multiple local scaling exponents and the signal is termed

multiFRACTAL. Exercise testing is used extensively to quantify and monitor

cardiorespiratory health, yet to our knowledge there has been no previous

multiFRACTAL investigation of exercise-induced changes in heart rate dynamics.

Ambulatory ECGs were acquired from eight healthy participants. Linear descriptive

statistics and a parameterisation of multiFRACTAL singularity spectra were

determined for inter-beat (RR) and intra-beat (QT) time-series before and after

exercise. Multivariate analyses of both linear and multiFRACTAL measures

discriminated between pre- and post-exercise periods and proportionally more

significant correlations were observed between linear than between multiFRACTAL

measures. Variance was more uniformly distributed over the first three principal

components for multiFRACTAL measures and the two classes of measures were

uncorrelated. Order and phase randomisation of the time-series indicated that

both sample distribution and correlation properties contribute to

multiFRACTALilty. This exploratory study indicates the possibility of using

physical exercise in conjunction with multiFRACTAL methodology as an adjunctive

description of autonomically mediated modulation of heart rate.

Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

PMID: 22459102 [PubMed - indexed for MEDLINE]

70. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Jan;85(1 Pt 1):011123. Epub 2012

Jan 13.

Linear polymers in disordered media: the shortest, the longest, and the mean

self-avoiding walk on percolation clusters.

Janssen HK(1), Stenull O.

Author information:

(1)Institut für Theoretische Physik III, Heinrich-Heine-Universität, D-40225

Düsseldorf, Germany.

Long linear polymers in strongly disordered media are well described by

self-avoiding walks (SAWs) on percolation clusters and a lot can be learned about

the statistics of these polymers by studying the length distribution of SAWs on

percolation clusters. This distribution encompasses 2 distinct averages, viz.,

the average over the conformations of the underlying cluster and the SAW

conformations. For the latter average, there are two basic options, one being

static and one being kinetic. It is well known for static averaging that if the

disorder of the underlying medium is weak, this disorder is redundant in the

sense of the renormalization group; i.e., differences from the ordered case appear

merely in nonuniversal quantities. Using dynamical field theory, we show that the

same holds true for kinetic averaging. Our main focus, however, lies on strong

disorder, i.e., the medium being close to the percolation point, where disorder

is relevant. Employing a field theory for the nonlinear random resistor network

in conjunction with a real-world interpretation of the corresponding Feynman

diagrams, we calculate the scaling exponents for the shortest, the longest, and

the mean or average SAW to 2-loop order. In addition, we calculate to 2-loop

order the entire family of multiFRACTAL exponents that governs the moments of the

statistical weights of the elementary constituents (bonds or sites of the

underlying FRACTAL cluster) contributing to the SAWs. Our RG analysis reveals

that kinetic averaging leads to renormalizability whereas static averaging does

not, and hence, we argue that the latter does not lead to a well-defined scaling

limit. We discuss the possible implications of this finding for experiments and

numerical simulations which have produced widespread results for the exponent of

the average SAW. To corroborate our results, we also study the well-known

Meir-Harris model for SAWs on percolation clusters. We demonstrate that the

Meir-Harris model leads back up to 2-loop order to the renormalizable real-world

formulation with kinetic averaging if the replica limit is consistently performed

at the first possible instant in the course of the calculation.

© 2012 American Physical Society

PMID: 22400528 [PubMed - indexed for MEDLINE]

71. Biomed Microdevices. 2012 Jun;14(3):541-8. doi: 10.1007/s10544-012-9631-1.

Application of multiFRACTAL analysis on microscopic images in the classification

of metastatic bone disease.

Vasiljevic J(1), Reljin B, Sopta J, Mijucic V, Tulic G, Reljin I.

Author information:

(1)Institute "Mihajlo Pupin", University of Belgrade, Volgina 15, Belgrade, Serbia.

The paper considers the method, based on multiFRACTAL (MF) analysis, for

classifying the shape of tissue cells from microscopic images, identifying the

primary cancer in cases of metastatic bone disease. Diagnosis of primary cancer

is of great importance, because further treatment depends on how successful and

accurate that diagnosis is. This method can be applied as an additional and

objective tool in primary cancer diagnosis, as well as in decreasing the

subjective factor and error probability. The method is tested over a large number

(1050) of clinical cases from the Institute of Pathology, University of Belgrade.

The results of computer-aided analysis of images have been presented and discussed.


PMID: 22327812 [PubMed - indexed for MEDLINE]

72. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Dec;84(6 Pt 2):066123. Epub 2011

Dec 27.

MonoFRACTAL and multiFRACTAL analysis of the spatial distribution of earthquakes

in the central zone of Chile.

Pastén D(1), Muñoz V, Cisternas A, Rogan J, Valdivia JA.

Author information:

(1)Departamento de Física, Facultad de Ciencias, Universidad de Chile, Casilla 653,

Santiago, Chile.

Statistical and FRACTAL properties of the spatial distribution of earthquakes in

the central zone of Chile are studied. In particular, data are shown to behave

according to the well-known Gutenberg-Richter law. The FRACTAL structure is

evident for epicenters, not for hypocenters. The multiFRACTAL spectrum is also

determined, both for the spatial distribution of epicenters and hypocenters. For

negative values of the index of multiFRACTAL measure q, the multiFRACTAL

spectrum, which usually cannot be reliably found from data, is calculated from a

generalized Cantor-set model, which fits the multiFRACTAL spectrum for q > 0, a

technique which has been previously applied for analysis of solar wind data.

PMID: 22304171 [PubMed]
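The Gutenberg-Richter law cited above, log10 N(>=M) = a - b*M, is routinely fit with Aki's maximum-likelihood estimator for the b-value. The sketch below applies it to a synthetic catalog (the names and the synthetic data are mine for illustration; this is not the authors' Chilean dataset):

```python
import numpy as np

def gr_b_value(magnitudes, m_c):
    """Aki maximum-likelihood b-value for a catalog complete above m_c.

    Under log10 N(>=M) = a - b*M, magnitudes above the completeness
    threshold m_c are exponentially distributed, giving
    b = log10(e) / (mean(M) - m_c)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# synthetic catalog with a true b-value of 1.0 above completeness m_c = 2.0
rng = np.random.default_rng(1)
catalog = 2.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=20000)
```

With 20,000 events the estimate recovers the true b-value to within a few percent; the FRACTAL analyses in the paper then characterize the spatial distribution of the same epicenters.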

73. Top Cogn Sci. 2012 Jan;4(1):87-93; discussion 94-102. doi:

10.1111/j.1756-8765.2011.01164.x. Epub 2011 Oct 24.

Abstract concepts require concrete models: why cognitive scientists have not yet

embraced nonlinearly coupled, dynamical, self-organized critical, synergistic,

scale-free, exquisitely context-sensitive, interaction-dominant, multiFRACTAL,

interdependent brain-body-niche systems.

Wagenmakers EJ(1), van der Maas HL, Farrell S.

Author information:

(1)Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands.

Comment on

Top Cogn Sci. 2012 Jan;4(1):7-20.

Top Cogn Sci. 2012 Jan;4(1):35-50.

Top Cogn Sci. 2012 Jan;4(1):21-34.

Top Cogn Sci. 2012 Jan;4(1):51-62.

After more than 15 years of study, the 1/f noise or complex-systems approach to

cognitive science has delivered promises of progress, colorful verbiage, and

statistical analyses of phenomena whose relevance for cognition remains unclear.

What the complex-systems approach has arguably failed to deliver are concrete

insights about how people perceive, think, decide, and act. Without formal models

that implement the proposed abstract concepts, the complex-systems approach to

cognitive science runs the danger of becoming a philosophical exercise in

futility. The complex-systems approach can be informative and innovative, but

only if it is implemented as a formal model that allows concrete prediction,

falsification, and comparison against more traditional approaches.

Copyright © 2011 Cognitive Science Society, Inc.

PMID: 22253182 [PubMed - indexed for MEDLINE]

74. Top Cogn Sci. 2012 Jan;4(1):51-62. doi: 10.1111/j.1756-8765.2011.01162.x. Epub

2011 Oct 24.

MultiFRACTAL dynamics in the emergence of cognitive structure.

Dixon JA(1), Holden JG, Mirman D, Stephen DG.

Author information:

(1)Department of Psychology, University of Connecticut, USA.

Comment in

Top Cogn Sci. 2012 Jan;4(1):72-7; discussion 94-102.

Top Cogn Sci. 2012 Jan;4(1):87-93; discussion 94-102.

Top Cogn Sci. 2012 Jan;4(1):63-71; discussion 94-102.

Top Cogn Sci. 2012 Jan;4(1):84-6; discussion 94-102.

Top Cogn Sci. 2012 Jan;4(1):78-83; discussion 94-102.

The complex-systems approach to cognitive science seeks to move beyond the

formalism of information exchange and to situate cognition within the broader

formalism of energy flow. Changes in cognitive performance exhibit a FRACTAL

(i.e., power-law) relationship between size and time scale. These FRACTAL

fluctuations reflect the flow of energy at all scales governing cognition.

Information transfer, as traditionally understood in the cognitive sciences, may

be a subset of this multiscale energy flow. The cognitive system exhibits not

just a single power-law relationship between fluctuation size and time scale but

actually exhibits many power-law relationships, whether over time or space. This

change in FRACTAL scaling, that is, multiFRACTALity, provides new insights into

changes in energy flow through the cognitive system. We survey recent findings

demonstrating the role of multiFRACTALity in (a) understanding atypical

developmental outcomes, and (b) predicting cognitive change. We propose that

multiFRACTALity provides insights into energy flows driving the emergence of

cognitive structure.

Copyright © 2011 Cognitive Science Society, Inc.

PMID: 22253177 [PubMed - indexed for MEDLINE]

75. PLoS One. 2012;7(1):e29956. doi: 10.1371/journal.pone.0029956. Epub 2012 Jan 6.

Comparing monoFRACTAL and multiFRACTAL analysis of corrosion damage evolution in

reinforcing bars.

Xu Y(1), Qian C, Pan L, Wang B, Lou C.

Author information:

(1)School of Materials Science and Engineering, Southeast University, Nanjing,

Jiangsu, People's Republic of China.

Based on FRACTAL theory and damage mechanics, the aim of this paper is to

describe the monoFRACTAL and multiFRACTAL characteristics of corrosion morphology

and develop a new approach to characterize the nonuniform corrosion degree of

reinforcing bars. The relationship between FRACTAL parameters and tensile

strength of reinforcing bars is discussed. The results showed that corrosion

mass loss ratio of a bar cannot accurately reflect the damage degree of the bar.

The corrosion morphology of reinforcing bars exhibits both monoFRACTAL and

multiFRACTAL features. The FRACTAL dimension and the tensile strength of corroded

steel bars exhibit a power function relationship, while the width of multiFRACTAL

spectrum and tensile strength of corroded steel bars exhibit a linear

relationship. By comparison, using width of multiFRACTAL spectrum as multiFRACTAL

damage variable not only reflects the distribution of corrosion damage in

reinforcing bars, but also reveals the influence of nonuniform corrosion on the

mechanical properties of reinforcing bars. The present research provides a new

approach for the establishment of corrosion damage constitutive models of

reinforcing bars.

PMCID: PMC3253123

PMID: 22238682 [PubMed - indexed for MEDLINE]

76. Anal Cell Pathol (Amst). 2012;35(2):123-6. doi: 10.3233/ACP-2011-0045.

MultiFRACTAL feature descriptor for histopathology.

Atupelage C(1), Nagahashi H, Yamaguchi M, Sakamoto M, Hashiguchi A.

Author information:

(1)Department of Computational Intelligence and Systems Science, Tokyo Institute of

Technology, Tokyo, Japan.

BACKGROUND: Histologic image analysis plays an important role in cancer

diagnosis. It describes the structure of body tissues, and abnormal structure

raises the suspicion of cancer or other diseases. Observing the

structural changes of these chaotic textures with the human eye is a challenging

process. However, the challenge can be overcome by forming a mathematical descriptor

to represent the histologic texture and classify the structural changes via a

sophisticated computational method.

OBJECTIVE: In this paper, we propose a texture descriptor that maps the

histologic texture into a highly discriminative feature space.

METHOD: FRACTAL dimension describes self-similar structures in a different and

more accurate manner than topological dimension. Further, the FRACTAL phenomenon

has been extended to natural structures (images) as multiFRACTAL dimension. We

exploited multiFRACTAL analysis to represent the histologic texture, which

derives a more discriminative feature space for classification.

RESULTS: We utilized a set of histologic images (belonging to liver and prostate

specimens) to assess the discriminative power of the multiFRACTAL features. The

experiment was organized to classify the given histologic texture as cancer and

non-cancer. The results show the discrimination capability of multiFRACTAL

features by achieving approximately 95% of correct classification rate.

CONCLUSION: MultiFRACTAL features are more effective for describing the histologic

texture. The proposed feature descriptor achieved a high classification rate for

both liver and prostate datasets.

PMID: 22101185 [PubMed - indexed for MEDLINE]

77. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Sep;84(3 Pt 2):036212. Epub 2011

Sep 22.

Perturbation approach to multiFRACTAL dimensions for certain critical

random-matrix ensembles.

Bogomolny E(1), Giraud O.

Author information:

(1)Univ. Paris-Sud, CNRS, LPTMS, UMR 8626, Orsay, F-91405, France.

FRACTAL dimensions of eigenfunctions for various critical random matrix ensembles

are investigated in perturbation series in the regimes of strong and weak

multiFRACTALity. In both regimes, we obtain expressions similar to those of the

critical banded random matrix ensemble extensively discussed in the literature.

For certain ensembles, the leading-order term for weak multiFRACTALity can be

calculated within standard perturbation theory. For other models, such a direct

approach requires modifications, which are briefly discussed. Our analytical

formulas are in good agreement with numerical calculations.

PMID: 22060480 [PubMed]

78. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Sep;84(3 Pt 2):036118. Epub 2011

Sep 30.

MultiFRACTALity of complex networks.

Furuya S(1), Yakubo K.

Author information:

(1)Department of Mathematical Informatics, The University of Tokyo, Tokyo 113-8656, Japan.


We demonstrate analytically and numerically the possibility that the FRACTAL

property of a scale-free network cannot be characterized by a unique FRACTAL

dimension and the network takes a multiFRACTAL structure. It is found that the

mass exponents τ(q) for several deterministic, stochastic, and real-world FRACTAL

scale-free networks are nonlinear functions of q, which implies that structural

measures of these networks obey the multiFRACTAL scaling. In addition, we give a

general expression of τ(q) for some class of FRACTAL scale-free networks by a

mean-field approximation. The multiFRACTAL property of network structures is a

consequence of large fluctuations of local node density in scale-free networks.

PMID: 22060467 [PubMed]

79. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Sep;84(3 Pt 1):031918. Epub 2011

Sep 19.

MultiFRACTAL analysis of thermal denaturation based on the Peyrard-Bishop-Dauxois

model.
Behnia S(1), Akhshani A, Panahi M, Mobaraki A, Ghaderian M.

Author information:

(1)Department of Physics, Urmia University of Technology, Orumieh, Iran.

The theory of DNA dynamics is exceedingly complex and not easily explained. In

the past two decades, by adapting methods of statistical physics, the dynamics of

DNA in contact with a thermal bath has been widely studied. In this paper, the thermal

denaturation of DNA in the framework of the Peyrard-Bishop-Dauxois (PBD) model

through the Rényi dimension is investigated. As a result, the Rényi dimension

spectrum of the melting transition process reveals the multiFRACTAL nature of the

dynamics of the Peyrard-Bishop-Dauxois model. Also, it can be concluded that the

Rényi dimension (D(q)) at negative values of q is the characteristic signature of

pre-melting and thermal denaturation of DNA. Furthermore, this approach is in

excellent agreement with previous experimental studies.

PMID: 22060414 [PubMed - indexed for MEDLINE]
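The Rényi dimension spectrum D(q) invoked above can be made concrete on a measure whose spectrum is exactly computable. The sketch below evaluates D(q) for a binomial multiplicative cascade, a standard toy multiFRACTAL used here only as a stand-in for the PBD melting statistics; the function names are my own:

```python
import numpy as np

def binomial_measure(p, n):
    """Multiplicative binomial cascade with weights (p, 1-p) after n levels."""
    m = np.array([1.0])
    for _ in range(n):
        m = np.concatenate([m * p, m * (1 - p)])
    return m

def renyi_dimensions(prob, qs):
    """Renyi dimension spectrum D(q) of a measure on 2**n equal boxes."""
    prob = np.asarray(prob, dtype=float)
    n = int(np.log2(len(prob)))
    log_eps = -n * np.log(2.0)  # box size is 2**(-n)
    dq = []
    for q in qs:
        if q == 1:
            # information dimension: the q -> 1 limit of D(q)
            dq.append(np.sum(prob * np.log(prob)) / log_eps)
        else:
            dq.append(np.log(np.sum(prob ** q)) / ((q - 1) * log_eps))
    return np.array(dq)
```

For this cascade D(0) = 1 exactly (the support fills the interval), D(1) is the normalized entropy, and D(2) = -log2(p**2 + (1-p)**2); a q-dependent D(q), as in the PBD melting transition, is precisely the multiFRACTAL signature.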

80. Front Integr Neurosci. 2011 Oct 17;5:62. doi: 10.3389/fnint.2011.00062.

eCollection 2011.

Effects of accuracy feedback on FRACTAL characteristics of time estimation.

Kuznetsov NA(1), Wallot S.

Author information:

(1)Perceptual-Motor Dynamics Laboratory, Department of Psychology, CAP Center for

Cognition, Action and Perception, University of Cincinnati Cincinnati, OH, USA.

The current experiment investigated the effect of visual accuracy feedback on the

structure of variability of time interval estimates in the continuation tapping

paradigm. Participants were asked to repeatedly estimate a 1-s interval for a

prolonged period of time by tapping their index finger. In some conditions,

participants received accuracy feedback after every estimate, whereas in other

conditions, no feedback was given. Also, the likelihood of receiving visual

feedback was manipulated by adjusting the tolerance band around the 1-s target

interval so that feedback was displayed only if the temporal estimate deviated

from the target interval by more than 50, 100, or 200 ms respectively. We

analyzed the structure of variability of the inter-tap intervals with FRACTAL and

multiFRACTAL methods that allow for a quantification of complex long-range

correlation patterns in the timing performance. Our results indicate that

feedback changes the long-range correlation structure of time estimates:

Increased amounts of feedback lead to a decrease in FRACTAL long-range

correlations, as well to a decrease in the magnitude of local fluctuations in the

performance. The multiFRACTAL characteristics of the time estimates were not

impacted by the presence of accuracy feedback. Nevertheless, most of the data

sets show significant multiFRACTAL signatures. We interpret these findings as

showing that feedback acts to constrain and possibly reorganize timing

performance. Implications for mechanistic and complex systems-based theories of

timing behavior are discussed.

PMCID: PMC3201842

PMID: 22046149 [PubMed]

81. BMC Genomics. 2011 Oct 14;12:506. doi: 10.1186/1471-2164-12-506.

The human genome: a multiFRACTAL analysis.

Moreno PA(1), Vélez PE, Martínez E, Garreta LE, Díaz N, Amador S, Tischer I,

Gutiérrez JM, Naik AK, Tobar F, García F.

Author information:

(1)Escuela de Ingeniería de Sistemas y Computación, Universidad del Valle, Santiago

de Cali, Colombia.

BACKGROUND: Several studies have shown that genomes can be studied via a

multiFRACTAL formalism. Recently, we used a multiFRACTAL approach to study the

genetic information content of the Caenorhabditis elegans genome. Here we

investigate the possibility that the human genome shows a similar behavior to

that observed in the nematode.

RESULTS: We report here multiFRACTALity in the human genome sequence. This

behavior correlates strongly with the presence of Alu elements and to a lesser

extent with CpG islands and (G+C) content. In contrast, little or no relationship was

found for LINE, MIR, MER, LTRs elements and DNA regions poor in genetic

information. Gene function, cluster of orthologous genes, metabolic pathways, and

exons tended to increase their frequencies with ranges of multiFRACTALity and

large gene families were located in genomic regions with varied multiFRACTALity.

Additionally, a multiFRACTAL map and classification for human chromosomes are proposed.


CONCLUSIONS: Based on these findings, we propose a descriptive non-linear model

for the structure of the human genome, with some biological implications. This

model reveals 1) a multiFRACTAL regionalization where many regions coexist that

are far from equilibrium and 2) this non-linear organization has significant

molecular and medical genetic implications for understanding the role of Alu

elements in genome stability and structure of the human genome. Given the role of

Alu sequences in gene regulation, genetic diseases, human genetic diversity,

adaptation and phylogenetic analyses, these quantifications are especially


PMCID: PMC3277318

PMID: 21999602 [PubMed - indexed for MEDLINE]

82. Anal Quant Cytol Histol. 2011 Aug;33(4):211-4.

MultiFRACTAL spectrum of chorionic villi: a novel approach.

Dey P.

Author information:

Department of Cytology, Postgraduate Institute of Medical Education and Research,

Chandigarh, India.

OBJECTIVE: To measure the multiFRACTAL dimension in histopathology sections of

chorionic villi to study its role to distinguish between normal chorionic villi

and hydatidiform mole.

STUDY DESIGN: MultiFRACTAL spectrum was measured in 10 cases each of normal

chorionic villi and hydatidiform mole. The values of alpha max and alpha min and

their difference Delta alpha (alpha max - alpha min) were recorded in each case.

The data for these groups were compared.

RESULTS: The mean +/- SD of alpha max, alpha min, and Delta alpha (alpha

max - alpha min) of normal chorionic villi were 2.6335 +/- 0.16109, 1.6975 +/-

0.04435, and 0.9360 +/- 0.12725, respectively. Whereas the mean +/- SD of alpha

max, alpha min, and Delta of hydatidiform moles were 2.3196 +/- 0.11937, 1.6209

+/- 0.06208, and 0.7000 +/- 0.08350, respectively. The mean alpha max, alpha min,

and Delta alpha of normal chorionic villi were much higher than for hydatidiform

mole. Independent sample t-test shows significant difference (p < 0.001) in alpha

max, alpha min, and Delta alpha of normal chorionic villi and hydatidiform mole.

CONCLUSION: MultiFRACTAL dimension was significantly different in normal

chorionic villi and hydatidiform mole.

PMID: 21980625 [PubMed - indexed for MEDLINE]
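
[Compiler's note: entry 82 discriminates tissue classes by the spectrum endpoints alpha max, alpha min and the width Delta alpha. These quantities can be made concrete on the textbook binomial measure, whose mass exponent tau(q) is known in closed form; this is a sketch only, and the weight p = 0.3 is an arbitrary illustration, not data from the paper.]

```python
import numpy as np

def binomial_multifractal_spectrum(p, qs):
    """Singularity spectrum (alpha, f(alpha)) of the binomial measure,
    obtained by the Legendre transform of tau(q) = -log2(p^q + (1-p)^q)."""
    qs = np.asarray(qs, dtype=float)
    tau = -np.log2(p ** qs + (1 - p) ** qs)
    alpha = np.gradient(tau, qs)   # alpha(q) = d tau / d q
    f = qs * alpha - tau           # Legendre transform
    return alpha, f

qs = np.linspace(-20, 20, 4001)
alpha, f = binomial_multifractal_spectrum(0.3, qs)
width = alpha.max() - alpha.min()  # Delta alpha, the entry's key statistic
```

The spectrum peaks at f = 1 (the support dimension), and a wider Delta alpha signals a more heterogeneous measure, which is the property the entry uses to separate normal villi from hydatidiform mole.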

83. J Phys Condens Matter. 2011 Oct 19;23(41):415601.


Criticality without self-similarity: a 2D system with random long-range hopping.

Ossipov A(1), Rushkin I, Cuevas E.

Author information:

(1)School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, UK.


We consider a simple model of quantum disorder in two dimensions, characterized

by a long-range site-to-site hopping. The system undergoes a metal–insulator

transition: its eigenfunctions change from being extended to being localized. We

demonstrate that at the point of the transition the nature of the eigenfunctions

depends crucially on the magnitude of the hopping amplitude. At small amplitudes

they are strongly multiFRACTAL. In the opposite limit of large amplitudes, the

eigenfunctions do not become FRACTAL. Their density moments do not scale as a

power of the system size; instead our result suggests a power of the logarithm of

the system size. In this regard, the transition differs from a similar one in the

one-dimensional version of the same system, as well as from the conventional

Anderson transition in more than two dimensions.

© 2011 IOP Publishing Ltd

PMID: 21959771 [PubMed]

84. J Rehabil Res Dev. 2011;48(7):787-800.

Using multiFRACTAL detrended fluctuation analysis to assess sacral skin blood

flow oscillations in people with spinal cord injury.

Liao F(1), Jan YK.

Author information:

(1)Department of Rehabilitation Sciences, University of Oklahoma Health Sciences

Center, Oklahoma City, 73117, USA.

The purpose of this study was to investigate whether the multiFRACTAL detrended

fluctuation analysis (MDFA) of skin blood flow oscillations (BFO) differed

between nondisabled controls and people with spinal cord injury (SCI). The study

of skin BFO has shown promise for assessing blood flow control mechanisms and

risk for pressure ulcers. We recruited 23 subjects, including 11 people with SCI

and 12 nondisabled controls. Thermally induced maximal sacral skin BFO were

measured by laser Doppler flowmetry. MDFA was used to characterize nonlinear

complexity of metabolic (0.0095 to 0.02 Hz), neurogenic (0.02 to 0.05 Hz), and

myogenic (0.05 to 0.15 Hz) BFO. We found that maximal vasodilation was

significantly smaller in people with SCI than in nondisabled controls (p < 0.05).

MDFA showed that metabolic BFO exhibited less complexity in people with SCI (p <

0.05), neurogenic BFO exhibited less complexity in people with complete SCI (p <

0.05), and myogenic BFO did not show significant differences between people with

SCI and nondisabled controls. This study demonstrated the feasibility of using

the MDFA to characterize nonlinear complexity of BFO, which is related to

vasodilatory functions in people with SCI.

PMID: 21938665 [PubMed - indexed for MEDLINE]
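
[Compiler's note: entries 84 and 92 rest on multiFRACTAL detrended fluctuation analysis (MDFA). The core recipe, estimating the generalized Hurst exponents h(q), can be sketched as below; window sizes and q-orders are illustrative defaults, not the papers' settings.]

```python
import numpy as np

def mfdfa_hq(series, qs, scales, order=1):
    """Generalized Hurst exponents h(q) by multifractal DFA: build the
    integrated profile, detrend it in non-overlapping windows of each
    scale s, and fit the scaling F_q(s) ~ s^h(q)."""
    profile = np.cumsum(series - np.mean(series))
    hq = []
    for q in qs:
        log_f, log_s = [], []
        for s in scales:
            n = len(profile) // s
            t = np.arange(s)
            var = np.empty(n)
            for i in range(n):
                seg = profile[i * s:(i + 1) * s]
                coef = np.polyfit(t, seg, order)          # local trend
                var[i] = np.mean((seg - np.polyval(coef, t)) ** 2)
            if q == 0:                                    # logarithmic average
                fq = np.exp(0.5 * np.mean(np.log(var)))
            else:
                fq = np.mean(var ** (q / 2.0)) ** (1.0 / q)
            log_f.append(np.log(fq))
            log_s.append(np.log(s))
        hq.append(np.polyfit(log_s, log_f, 1)[0])
    return np.array(hq)

# Uncorrelated noise should give h(q) near 0.5 at every order q;
# a q-dependent h(q) is the multifractal signature the entries exploit.
noise = np.random.default_rng(1).standard_normal(2 ** 14)
h = mfdfa_hq(noise, qs=[-2, 0, 2], scales=[16, 32, 64, 128, 256])
```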

85. Physiol Meas. 2011 Oct;32(10):1681-99. doi: 10.1088/0967-3334/32/10/014. Epub

2011 Sep 19.

Aging in autonomic control by multiFRACTAL studies of cardiac interbeat intervals

in the VLF band.

Makowiec D(1), Rynkiewicz A, Wdowczyk-Szulc J, Zarczyńska-Buchowiecka M, Gałaska

R, Kryszewski S.

Author information:

(1)Institute of Theoretical Physics and Astrophysics, University of Gdańsk, 80-952

Gdańsk, ul Wita Stwosza 57, Poland.

The heart rate responds dynamically to various intrinsic and environmental

stimuli. The autonomic nervous system is said to play a major role in this

response. MultiFRACTAL analysis offers a novel method to assess the response of

cardiac interbeat intervals. Twenty-four hour ECG recordings of RR interbeat

intervals (of 48 elderly volunteers (age 65-94), 40 middle-aged persons (age

45-53) and 36 young adults (age 18-26)) were investigated to study the effect of

aging on autonomic regulation during normal activity in healthy adults. Heart

RR-interval variability in the very low frequency (VLF) band (32-420 RR

intervals) was evaluated by multiFRACTAL tools. The nocturnal and diurnal signals

of 6 h duration were studied separately. For each signal, the analysis was

performed twice: for a given signal and for the integrated signal. A multiFRACTAL

spectrum was quantified by the h(max) value at which a multiFRACTAL spectrum

attained its maximum, width of a spectrum, Hurst exponent, extreme events h(left)

and distance between the maxima of a signal and its integrated counterpart. The

following seven characteristics are suggested as quantifying the age-related

decrease in the autonomic function ('int' refers to the integrated signal): (a)

h(sleep)(max) - h(max)(wake) > 0.05 for a signal; (b) h(int)(max) > 1.15 for

wake; (c) h(int)(max) - h(max) > 0.85 for sleep; (d) Hurst(wake) - Hurst(sleep) <

0.01; (e) width(wake) > 0.07; (f) width(int) < 0.30 for sleep; (g) h(int)(left) >

0.75. Eighty-one percent of elderly people had at least four of these properties,

and ninety-two percent of young people had three or less. This shows that the

multiFRACTAL approach offers a concise and reliable index of healthy aging for

each individual. Additionally, the applied method yielded insights into dynamical

changes in the autonomic regulation due to the circadian cycle and aging. Our

observations support the hypothesis that imbalance in the autonomic control due

to healthy aging could be related to changes emerging from the vagal function

(Struzik et al 2006 IEEE Trans. Biomed. Eng. 53 89-94).

PMID: 21926460 [PubMed - indexed for MEDLINE]

86. PLoS One. 2011;6(9):e24331. doi: 10.1371/journal.pone.0024331. Epub 2011 Sep 6.

Facilitating joint chaos and FRACTAL analysis of biosignals through nonlinear

adaptive filtering.

Gao J(1), Hu J, Tung WW.

Author information:

(1)PMB Intelligence LLC, West Lafayette, Indiana, United States of America.

BACKGROUND: Chaos and random FRACTAL theories are among the most important for

fully characterizing nonlinear dynamics of complicated multiscale biosignals.

Chaos analysis requires that signals be relatively noise-free and stationary,

while FRACTAL analysis demands signals to be non-rhythmic and scale-free.

METHODOLOGY/PRINCIPAL FINDINGS: To facilitate joint chaos and FRACTAL analysis of

biosignals, we present an adaptive algorithm, which: (1) can readily remove

nonstationarities from the signal, (2) can more effectively reduce noise in the

signals than linear filters, wavelet denoising, and chaos-based noise reduction

techniques; (3) can readily decompose a multiscale biosignal into a series of

intrinsically bandlimited functions; and (4) offers a new formulation of FRACTAL

and multiFRACTAL analysis that is better than existing methods when a biosignal

contains a strong oscillatory component.

CONCLUSIONS: The presented approach is a valuable, versatile tool for the

analysis of various types of biological signals. Its effectiveness is

demonstrated by offering new important insights into brainwave dynamics and the

very high accuracy in automatically detecting epileptic seizures from EEG.


PMCID: PMC3167840

PMID: 21915312 [PubMed - indexed for MEDLINE]

87. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Jul;84(1 Pt 2):016208. Epub 2011

Jul 14.

Arbitrary-order Hilbert spectral analysis for time series possessing scaling

statistics: comparison study with detrended fluctuation analysis and wavelet leaders.


Huang YX(1), Schmitt FG, Hermand JP, Gagne Y, Lu ZM, Liu YL.

Author information:

(1)Shanghai Institute of Applied Mathematics and Mechanics, Shanghai University,

Shanghai, China.

In this paper we present an extended version of Hilbert-Huang transform, namely

arbitrary-order Hilbert spectral analysis, to characterize the scale-invariant

properties of a time series directly in an amplitude-frequency space. We first

show numerically that due to a nonlinear distortion, traditional methods require

high-order harmonic components to represent nonlinear processes, except for the

Hilbert-based method. This will lead to an artificial energy flux from the

low-frequency (large scale) to the high-frequency (small scale) part. Thus the

power law, if it exists, is contaminated. We then compare the Hilbert method with

structure functions (SF), detrended fluctuation analysis (DFA), and wavelet

leader (WL) by analyzing fractional Brownian motion and synthesized multiFRACTAL

time series. For the former simulation, we find that all methods provide

comparable results. For the latter simulation, we perform simulations with an

intermittent parameter μ=0.15. We find that the SF underestimates scaling

exponent when q>3. The Hilbert method provides a slight underestimation when q>5.

However, both DFA and WL overestimate the scaling exponents when q>5. It seems

that Hilbert and DFA methods provide better singularity spectra than SF and WL.

We finally apply all methods to a passive scalar (temperature) data obtained from

a jet experiment with a Taylor's microscale Reynolds number Re(λ)≃250. Due to the

presence of strong ramp-cliff structures, the SF fails to detect the power law

behavior. For the traditional method, the ramp-cliff structure causes a serious

artificial energy flux from the low-frequency (large scale) to the high-frequency

(small scale) part. Thus DFA and WL underestimate the scaling exponents. However,

the Hilbert method provides scaling exponents ξ(θ)(q) quite close to the one for

longitudinal velocity, indicating a less intermittent passive scalar field than

what was believed before.

PMID: 21867274 [PubMed - indexed for MEDLINE]

88. Phys Rev Lett. 2011 Jul 8;107(2):028101. Epub 2011 Jul 8.

Cell surface as a FRACTAL: normal and cancerous cervical cells demonstrate

different FRACTAL behavior of surface adhesion maps at the nanoscale.

Dokukin ME(1), Guz NV, Gaikwad RM, Woodworth CD, Sokolov I.

Author information:

(1)Department of Physics, Clarkson University, Potsdam, New York 13699-5820, USA.

Here we show that the surface of human cervical epithelial cells demonstrates

substantially different FRACTAL behavior when the cell becomes cancerous.

Analyzing the adhesion maps of individual cervical cells, which were obtained

using the atomic force microscopy operating in the HarmoniX mode, we found that

cancerous cells demonstrate simple FRACTAL behavior, whereas normal cells can

only be approximated at best as multiFRACTAL. Tested on ~300 cells collected from

12 humans, the FRACTAL dimensionality of cancerous cells is found to be

unambiguously higher than that for normal cells.

PMID: 21797643 [PubMed - indexed for MEDLINE]

89. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Jun;83(6 Pt 2):066210. Epub 2011

Jun 22.

MultiFRACTAL analysis of nonhyperbolic coupled map lattices: application to

genomic sequences.

Provata A(1), Beck C.

Author information:

(1)Institute of Physical Chemistry, National Center for Scientific Research

Demokritos, GR-15310 Athens, Greece.

Symbolic sequences generated by coupled map lattices (CMLs) can be used to model

the chaotic-like structure of genomic sequences. In this study it is shown that

diffusively coupled Chebyshev maps of order 4 (corresponding to a shift of four

symbols) very closely reproduce the multiFRACTAL spectrum D(q) of human genomic

sequences for coupling constant α = 0.35 ± 0.01 if q > 0. The presence of rare

configurations causes deviations for q < 0, which disappear if the rare event

statistics of the CML is modified. Such rare configurations are known to play

specific functional roles in genomic sequences serving as promoters or regulatory


PMID: 21797464 [PubMed - indexed for MEDLINE]

90. Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Mar;83(3 Pt 1):031801. Epub 2011

Mar 2.

Density fluctuations of polymers in disordered media.

Deutsch JM(1), de la Cruz MO.

Author information:

(1)Department of Physics, University of California, Santa Cruz, California 95064, USA.


We study self-avoiding random walks in an environment where sites are excluded

randomly, in two and three dimensions. For a single polymer chain, we study the

statistics of the time averaged monomer density and show that these are well

described by multiFRACTAL statistics. This is true even far from the percolation

transition of the disordered medium. We investigate solutions of chains in a

disordered environment and show that the statistics cease to be multiFRACTAL

beyond the screening length of the solution.

PMID: 21517516 [PubMed - indexed for MEDLINE]

91. Guang Pu Xue Yu Guang Pu Fen Xi. 2011 Feb;31(2):473-7.

[FRACTAL characteristics of visible spectra across a hilly area].

[Article in Chinese]

Zhang FS(1), Liu ZX, Wan HL, Liu M.

Author information:

(1)Institute of Applied Ecology, Chinese Academy of Sciences, Key Laboratory of

Liaoning Water-Saving Agriculture, Shenyang 110016, China.

The spectral characteristic of remotely sensed image is mainly the results of

integrative effects on spectrum from heterogeneous ground reflectors.

Investigating its spatial distribution characteristics may be helpful for image

interpreting and modeling based on remote sensing technique. In the present

study, spatial heterogeneity of remotely sensed multispectral TM image across a

hilly area in late October was studied by the combination of statistical method

and multiFRACTAL analysis. The results showed that distribution of digital number

(DN) values of visible spectra (0.45-0.69 microm) had statistical

scale-invariance. The generalized FRACTAL dimension function D(q) suggested that

distribution of TM 2 (0.52-0.60 microm) DN values was monoFRACTAL type, whereas

DN values of TM 1 (0.45-0.52 microm) and TM 3 (0.63-0.69 microm) had multiFRACTAL

distribution characteristics. The parameters (alpha(max)-alpha(min)) and

[f(a(max))-f(alpha(min))] of multiFRACTAL spectra further indicated that TM 3 DN

values had the high est spatial heterogeneity and most abundant information,

followed by TM 1, while the extremely narrow spectrum of TM 2 DN values showed

its relatively low spatial heterogeneity and information capacity.

PMID: 21510407 [PubMed]

92. Med Biol Eng Comput. 2011 Aug;49(8):925-34. doi: 10.1007/s11517-011-0775-6. Epub

2011 Apr 13.

MultiFRACTAL analysis of nonlinear complexity of sacral skin blood flow

oscillations in older adults.

Liao F(1), Struck BD, Macrobert M, Jan YK.

Author information:

(1)Department of Rehabilitation Sciences, University of Oklahoma Health Sciences

Center, Oklahoma City, OK, USA.

The objective of this study was to investigate the relationship between cutaneous

vasodilatory function and nonlinear complexity of blood flow oscillations (BFO)

in older people. A non-painful fast local heating protocol was applied to the

sacral skin in 20 older subjects with various vasodilatory functions. Laser

Doppler flowmetry was used to measure skin blood oscillations. The complexity of

the characteristic frequencies (i.e., metabolic (0.0095-0.02 Hz), neurogenic

(0.02-0.05 Hz), myogenic (0.05-0.15 Hz), respiratory (0.15-0.4 Hz), and cardiac

(0.4-2 Hz)) of BFO was quantified using the multiFRACTAL detrended fluctuation

analysis. Compared with the 65-75 years group, the complexity of metabolic BFO in

the 75-85 years group was significantly lower at the baseline (P < 0.05) and the

second peak (P < 0.001). Compared with baseline BFO, subjects in the 65-75 years

group had a significant increase in the complexity of metabolic BFO (P < 0.01) in

response to local heating; while subjects in the 75-85 years group did not. Our

findings support the use of multiFRACTAL analysis to assess aging-related

microvascular dysfunction.

PMCID: PMC3140590

PMID: 21487818 [PubMed - indexed for MEDLINE]

93. Med Phys. 2011 Jan;38(1):83-95.

Prostate cancer characterization on MR images using FRACTAL features.

Lopes R(1), Ayache A, Makni N, Puech P, Villers A, Mordon S, Betrouni N.

Author information:

(1)Inserm, U703, Université Nord de France, 152 rue du Docteur Yersin, 59120 Loos,

CHRU Lille, France.

PURPOSE: Computerized detection of prostate cancer on T2-weighted MR images.

METHODS: The authors combined FRACTAL and multiFRACTAL features to perform

textural analysis of the images. The FRACTAL dimension was computed using the

Variance method; the multiFRACTAL spectrum was estimated by an adaptation of a

multifractional Brownian motion model. Voxels were labeled as tumor/nontumor via

nonlinear supervised classification. Two classification algorithms were tested:

Support vector machine (SVM) and AdaBoost.

RESULTS: Experiments were performed on images from 17 patients. Ground truth was

available from histological images. Detection and classification results

(sensitivity, specificity) were (83%, 91%) and (85%, 93%) for SVM and AdaBoost, respectively.


CONCLUSIONS: Classification using the authors' model combining FRACTAL and

multiFRACTAL features was more accurate than classification using classical

texture features (such as Haralick, wavelet, and Gabor filters). Moreover, the

method was more robust against signal intensity variations. Although the method

was only applied to T2 images, it could be extended to multispectral MR.

PMID: 21361178 [PubMed - indexed for MEDLINE]
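
[Compiler's note: entry 93 computes the FRACTAL dimension with the Variance method. The paper's pipeline is not reproduced here; the following is only a generic sketch of the variance (increment-scaling) idea for a grayscale image treated as a surface, where Var[I(x+d) - I(x)] ~ d^(2H) and D = 3 - H. The lag choices are illustrative.]

```python
import numpy as np

def variance_fractal_dimension(img, lags=(1, 2, 4, 8)):
    """Fractal dimension of a grayscale image treated as a surface, via
    the variance method: fit Var[I(x+d) - I(x)] ~ d^(2H), return D = 3 - H."""
    log_v, log_d = [], []
    for d in lags:
        inc = np.concatenate([
            (img[d:, :] - img[:-d, :]).ravel(),   # vertical increments
            (img[:, d:] - img[:, :-d]).ravel(),   # horizontal increments
        ])
        log_v.append(np.log(inc.var()))
        log_d.append(np.log(d))
    H = 0.5 * np.polyfit(log_d, log_v, 1)[0]      # slope = 2H
    return 3.0 - H

# White noise has lag-independent increment variance, so H ~ 0 and D ~ 3,
# the roughest possible surface.
img = np.random.default_rng(2).standard_normal((256, 256))
D = variance_fractal_dimension(img)
```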

94. Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Nov;82(5 Pt 1):051133. Epub 2010

Nov 29.

MultiFRACTALity of instantaneous normal modes at mobility edges.

Huang BJ(1), Wu TM.

Author information:

(1)Institute of Physics, National Chiao-Tung University, HsinChu, Taiwan 300,

Republic of China.

In terms of the multiFRACTAL analysis, we investigate the characteristics of the

instantaneous normal modes (INMs) at two mobility edges (MEs) of a simple fluid,

where the locations of the MEs in the INM spectrum were identified in a previous

work [B. J. Huang and T. M. Wu, Phys. Rev. E 79, 041105 (2009)]. The mass

exponents and the singularity spectrum of the INMs are obtained by the box-size

and system-size scalings under the typical average. The INM eigenvectors at a ME

exhibit a multiFRACTAL nature and the multiFRACTAL INMs at each ME yield the same

results in generalized FRACTAL dimensions and singularity spectrum. Our results

indicate that the singularity spectrum of the multiFRACTAL INMs agrees well with

that of the Anderson model at the critical disorder. This good agreement provides

numerical evidence for the universal multiFRACTALity at the

localization-delocalization transition. For the multiFRACTAL INMs, the

probability density function and the spatial correlation function of the squared

vibrational amplitudes are also calculated. The relation between the probability

density function and the singularity spectrum is examined numerically, so are the

relations between the critical exponents of the spatial correlation function and

the mass exponents of the multiFRACTAL INMs.

PMID: 21230463 [PubMed]

95. Front Physiol. 2012 Feb 7;2:123. doi: 10.3389/fphys.2011.00123. eCollection 2011.

Integrated central-autonomic multiFRACTAL complexity in the heart rate

variability of healthy humans.

Lin DC(1), Sharif A.

Author information:

(1)Department of Mechanical and Industrial Engineering, Ryerson University Toronto,

ON, Canada.

PURPOSE OF STUDY: The aim of this study was to characterize the central-autonomic

interaction underlying the multiFRACTALity in heart rate variability (HRV) of

healthy humans.

MATERIALS AND METHODS: Eleven young healthy subjects participated in two separate

~40 min experimental sessions, one in supine (SUP) and one in head-up-tilt

(HUT), upright (UPR) body positions. Surface scalp electroencephalography (EEG)

and electrocardiogram (ECG) were collected and FRACTAL correlation of brain and

heart rate data was analyzed based on the idea of relative multiFRACTALity. The

FRACTAL correlation was further examined with the EEG, HRV spectral measures

using linear regression of two variables and principal component analysis (PCA)

to find clues for the physiological processing underlying the central influence


RESULTS: We report evidence of a central-autonomic FRACTAL correlation (CAFC)

where the HRV multiFRACTAL complexity varies significantly with the FRACTAL

correlation between the heart rate and brain data (P = 0.003). The linear

regression shows significant correlation between CAFC measure and EEG Beta band

spectral component (P = 0.01 for SUP and P = 0.002 for UPR positions). There is

significant correlation between CAFC measure and HRV LF component in the SUP

position (P = 0.04), whereas the correlation with the HRV HF component approaches

significance (P = 0.07). The correlation between CAFC measure and HRV spectral

measures in the UPR position is weak. The PCA results confirm these findings and

further imply multiple physiological processes underlying CAFC, highlighting the

importance of the EEG Alpha, Beta band, and the HRV LF, HF spectral measures in

the supine position.

DISCUSSION AND CONCLUSION: The findings of this work can be summarized into three

points: (i) Similar FRACTAL characteristics exist in the brain and heart rate

fluctuation and the change toward stronger FRACTAL correlation implies the change

toward more complex HRV multiFRACTALity. (ii) CAFC is likely contributed by

multiple physiological mechanisms, with its central elements mainly derived from

the EEG Alpha, Beta band dynamics. (iii) The CAFC in SUP and UPR positions is

qualitatively different, with a more predominant central influence in the FRACTAL

HRV of the UPR position.

PMCID: PMC3277279

PMID: 22403548 [PubMed]

96. Microcirculation. 2011 Feb;18(2):136-51. doi: 10.1111/j.1549-8719.2010.00075.x.

MultiFRACTAL and lacunarity analysis of microvascular morphology and remodeling.

Gould DJ(1), Vadakkan TJ, Poché RA, Dickinson ME.

Author information:

(1)Department of Bioengineering, Rice University, Houston, Texas, USA.

OBJECTIVE: Classical measures of vessel morphology, including diameter and

density, are employed to study microvasculature in endothelial membrane labeled

mice. These measurements prove sufficient for some studies; however, they are

less well suited for quantifying changes in microcirculatory networks lacking

hierarchical structure. We demonstrate that automated multiFRACTAL analysis and

lacunarity may be used with classical methods to quantify microvascular remodeling.


METHODS: Using multiFRACTAL analysis and lacunarity, we present an automated

extraction tool with a processing pipeline to characterize 2D representations of

3D microvasculature. We apply our analysis on four tissues and the hyaloid

vasculature during remodeling.

RESULTS: We found that the vessel networks analyzed have multiFRACTAL geometries

and that kidney microvasculature has the largest FRACTAL dimension and the lowest

lacunarity compared to microvasculature networks in the cortex, skin, and thigh

muscle. Also, we found that, during hyaloid remodeling, there were differences in

multiFRACTAL spectra reflecting the functional transition from a space filling

vasculature which nurtures the lens to a less dense vasculature as it regresses,

permitting unobstructed vision.

CONCLUSION: MultiFRACTAL analysis and lacunarity are valuable additions to

classical measures of vascular morphology and will have utility in future studies

of normal, developing, and pathological tissues.

© 2011 John Wiley & Sons Ltd.

PMCID: PMC3049800

PMID: 21166933 [PubMed - indexed for MEDLINE]

97. Brain Res Bull. 2011 Apr 5;84(6):359-75. doi: 10.1016/j.brainresbull.2010.12.005.

Epub 2010 Dec 13.

Comparison of FRACTAL and power spectral EEG features: effects of topography and

sleep stages.

Weiss B(1), Clemens Z, Bódizs R, Halász P.

Author information:

(1)Faculty of Information Technology, Pázmány Péter Catholic University, Práter u.

50/a, Budapest, Hungary.

FRACTAL nature of the human sleep EEG was revealed recently. In the literature

there are some attempts to relate FRACTAL features to spectral properties.

However, a comprehensive assessment of the relationship between FRACTAL and power

spectral measures is still missing. Therefore, in the present study we

investigated the relationship of monoFRACTAL and multiFRACTAL EEG measures (H and

ΔD) with relative band powers and spectral edge frequency across different sleep

stages and topographic locations. In addition we tested sleep stage

classification capability of these measures according to different channels. We

found that cross-correlations between FRACTAL and spectral measures as well as

between H and ΔD exhibit specific topographic and sleep stage-related

characteristics. Best sleep stage classifications were achieved by estimating

measure ΔD in temporal EEG channels both at group and individual levels,

suggesting that assessing multiFRACTALity might be an adequate approach for

compact modeling of brain activities.

Copyright © 2010 Elsevier Inc. All rights reserved.

PMID: 21147200 [PubMed - indexed for MEDLINE]

98. Conf Proc IEEE Eng Med Biol Soc. 2010;2010:110-3.


Wavelet leader based multiFRACTAL analysis of heart rate variability during

myocardial ischaemia.

Leonarduzzi RF(1), Schlotthauer G, Torres ME.

Author information:

(1)Lab. Signals and Nonlinear Dynamics, Faculty of Engineering, Universidad Nacional

de Entre Ríos, Argentina.

Heart rate variability is a noninvasive and indirect measure of the autonomic

control of the heart. Therefore, alterations to this control system caused by

myocardial ischaemia are reflected in changes in the complex and irregular

fluctuations of this signal. MultiFRACTAL analysis is a well suited tool for the

analysis of this kind of fluctuations, since it gives a description of the

singular behavior of a signal. Recently, a new approach for multiFRACTAL analysis

was proposed, the wavelet leader based multiFRACTAL formalism, which shows

remarkable improvements over previous methods. In order to characterize and

detect ischaemic episodes, in this work we propose to perform a short-time

windowed wavelet leader based multiFRACTAL analysis. Our results suggest that

this new method provides appropriate indexes that could be used as a tool for the

detection of myocardial ischaemia.

PMID: 21095648 [PubMed - indexed for MEDLINE]

99. Conf Proc IEEE Eng Med Biol Soc. 2010;2010:106-9.


Methodology for multiFRACTAL analysis of heart rate variability: from LF/HF ratio

to wavelet leaders.

Abry P, Wendt H, Jaffard S, Helgason H, Goncalves P, Pereira E, Gharib C,

Gaucherand P, Doret M.

The present contribution aims at proposing a comprehensive and tutorial

introduction to the practical use of wavelet Leader based multiFRACTAL analysis

to study heart rate variability. First, the theoretical background is recalled.

Second, practical issues and pitfalls related to the selection of the scaling

range or statistical orders, minimal regularity, parabolic approximation of

spectrum and parameter estimation, are discussed. Third, multiFRACTAL analysis is

connected explicitly to other standard characterizations of heart rate

variability: (mono)FRACTAL analysis, Hurst exponent, spectral analysis and the

HF/LF ratio. This review is illustrated on real per partum fetal ECG data,

collected at an academic French public hospital, for both healthy fetuses and

fetuses suffering from acidosis.

PMID: 21095647 [PubMed - indexed for MEDLINE]

100. Am J Perinatol. 2011 Apr;28(4):259-66. doi: 10.1055/s-0030-1268713. Epub 2010 Nov


MultiFRACTAL analysis of fetal heart rate variability in fetuses with and without

severe acidosis during labor.

Doret M(1), Helgason H, Abry P, Goncalves P, Gharib C, Gaucherand P.

Author information:

(1)Hospices Civils de Lyon, Hôpital Femme-Mère-Enfant, service de

gynécologie-obstétrique, Bron, France.

We performed multiFRACTAL analysis of fetal heart rate (FHR) variability in

fetuses with and without acidosis during labor. MultiFRACTAL analysis was

performed on fetal electrocardiograms in 10-minute sliding windows within the

last 2 hours before delivery in 45 term fetuses divided in three groups according

to umbilical arterial pH and FHR pattern: group A had pH ≥7.30 and normal FHR,

group B had pH ≥7.30 and intermediate or abnormal FHR, and group C had acidosis

(pH ≤7.05) and intermediate or abnormal FHR. Six multiFRACTAL parameters were

compared using Wilcoxon rank sum test. MultiFRACTAL parameters were significantly

different between the three groups in the last 10 minutes before delivery (P

<0.05). Two parameters (H(min), zeta(2)) exhibited a significant difference 70

minutes before delivery, and one parameter (C(2)) was different 10 minutes before

birth (P <0.05). MultiFRACTAL parameters were significantly different in acidotic

and nonacidotic fetuses, independently from FHR pattern.

© Thieme Medical Publishers.

PMID: 21089007 [PubMed - indexed for MEDLINE]

101. Phys Med Biol. 2010 Oct 21;55(20):6279-97. doi: 10.1088/0031-9155/55/20/015. Epub

2010 Oct 6.

MultiFRACTAL analysis of heart rate variability and laser Doppler flowmetry

fluctuations: comparison of results from different numerical methods.

Humeau A(1), Buard B, Mahé G, Chapeau-Blondeau F, Rousseau D, Abraham P.

Author information:

(1)Laboratoire d'Ingénierie des Systèmes Automatisés, Université d'Angers, 62 avenue

Notre Dame du Lac, 49000 Angers, France.

To contribute to the understanding of the complex dynamics in the cardiovascular

system (CVS), the central CVS has previously been analyzed through multiFRACTAL

analyses of heart rate variability (HRV) signals that were shown to bring useful

contributions. Similar approaches for the peripheral CVS through the analysis of

laser Doppler flowmetry (LDF) signals are comparatively very recent. In this

direction, we propose here a study of the peripheral CVS through a multiFRACTAL

analysis of LDF fluctuations, together with a comparison of the results with

those obtained on HRV fluctuations simultaneously recorded. To perform these

investigations concerning the biophysics of the CVS, first we have to address the

problem of selecting a suitable methodology for multiFRACTAL analysis, allowing

us to extract meaningful interpretations on biophysical signals. For this

purpose, we test four existing methodologies of multiFRACTAL analysis. We also

present a comparison of their applicability and interpretability when implemented

on both simulated multiFRACTAL signals of reference and on experimental signals

from the CVS. One essential outcome of the study is that the multiFRACTAL

properties observed from both the LDF fluctuations (peripheral CVS) and the HRV

fluctuations (central CVS) appear very close and similar over the studied range

of scales relevant to physiology.

PMID: 20924134 [PubMed - indexed for MEDLINE]

102. J Chem Phys. 2010 Sep 28;133(12):124505. doi: 10.1063/1.3481099.

Molecular dynamics studies of ionically conducting glasses and ionic liquids:

wave number dependence of intermediate scattering function.

Habasaki J(1), Ngai KL.

Author information:

(1)Tokyo Institute of Technology, 4259 Nagatsuta-cho, Yokohama 226-8502, Japan.

Dynamical heterogeneity is a key feature to characterize both acceleration and

slowing down of the dynamics in interacting disordered materials. In the present

work, the heterogeneous ion dynamics in both ionically conducting glass and in

room temperature ionic liquids are characterized by the combination of the

concepts of Lévy distribution and multiFRACTALity. Molecular dynamics simulation

data of both systems are analyzed to obtain the fractional power law of the

k-dependence of the dynamics, which implies the Lévy distribution of length

scale. The multiFRACTALity of the motion and structures makes the system more

complex. Both contributions in the dynamics become separable by using g(k,t)

derived from the intermediate scattering function, F(s)(k,t). When the Lévy index

obtained from F(s)(k,t) is combined with FRACTAL dimension analysis of random

walks and multiFRACTAL analysis, all the spatial exponent controlling both fast

and slow dynamics are clarified. This analysis is generally applicable to other

complex interacting systems and is deemed beneficial for understanding their


PMID: 20886948 [PubMed]

103. Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Jul;82(1 Pt 1):011136. Epub 2010

Jul 27.

Detrending moving average algorithm for multiFRACTALs.

Gu GF(1), Zhou WX.

Author information:

(1)School of Business, East China University of Science and Technology, Shanghai

200237, China.

The detrending moving average (DMA) algorithm is a widely used technique to

quantify the long-term correlations of nonstationary time series and the

long-range correlations of FRACTAL surfaces, which contains a parameter θ

determining the position of the detrending window. We develop multiFRACTAL

detrending moving average (MFDMA) algorithms for the analysis of one-dimensional

multiFRACTAL measures and higher-dimensional multiFRACTALs, which is a

generalization of the DMA method. The performance of the one-dimensional and

two-dimensional MFDMA methods is investigated using synthetic multiFRACTAL

measures with analytical solutions for backward (θ=0), centered (θ=0.5), and

forward (θ=1) detrending windows. We find that the estimated multiFRACTAL scaling

exponent τ(q) and the singularity spectrum f(α) are in good agreement with the

theoretical values. In addition, the backward MFDMA method has the best

performance, which provides the most accurate estimates of the scaling exponents

with lowest error bars, while the centered MFDMA method has the worse

performance. It is found that the backward MFDMA algorithm also outperforms the

multiFRACTAL detrended fluctuation analysis. The one-dimensional backward MFDMA

method is applied to analyzing the time series of Shanghai Stock Exchange

Composite Index and its multiFRACTAL nature is confirmed.

PMID: 20866594 [PubMed]
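
The backward (θ=0) detrending-moving-average idea described above is easy to sketch for the one-dimensional, q-th-order case. The fragment below is an illustrative pure-Python reconstruction, not the authors' MFDMA code; the window sizes and the white-noise test signal are arbitrary choices (for white noise the q = 2 slope should come out near 0.5):

```python
import math
import random

def backward_dma_fluctuation(x, n, q):
    """Backward (theta = 0) detrending-moving-average fluctuation of order q."""
    # cumulative profile of the mean-removed series
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    # residuals against the backward moving average of window n
    res = []
    for i in range(n - 1, len(y)):
        ma = sum(y[i - n + 1:i + 1]) / n
        res.append(abs(y[i] - ma))
    return (sum(r ** q for r in res) / len(res)) ** (1.0 / q)

def dma_scaling_exponent(x, q=2.0, windows=(8, 16, 32, 64, 128)):
    """Estimate h(q) as the least-squares slope of log F_q(n) vs log n."""
    pts = [(math.log(n), math.log(backward_dma_fluctuation(x, n, q)))
           for n in windows]
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts) /
            sum((a - mx) ** 2 for a, _ in pts))

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(4000)]
h2 = dma_scaling_exponent(noise)
print(round(h2, 2))
```

For a full multiFRACTAL analysis one would repeat this over a range of q and read the multifractality off the q-dependence of h(q).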

104. Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Jun;81(6 Pt 2):066212. Epub 2010

Jun 18.

Generating a FRACTAL butterfly Floquet spectrum in a class of driven SU(2)

systems: eigenstate statistics.

Bandyopadhyay JN(1), Wang J, Gong J.

Author information:

(1)Department of Physics and Centre for Computational Science and Engineering,

National University of Singapore, Singapore 117542, Republic of Singapore.

The Floquet spectra of a class of driven SU(2) systems have been shown to display

butterfly patterns with multiFRACTAL properties. The implication of such critical

spectral behavior for the Floquet eigenstate statistics is studied in this work.

Following the methodologies for understanding the FRACTAL behavior of energy

eigenstates of time-independent systems on the Anderson transition point, we

analyze the distribution profile, the mean value, and the variance of the

logarithm of the inverse participation ratio of the Floquet eigenstates

associated with multiFRACTAL Floquet spectra. The results show that the Floquet

eigenstates also display FRACTAL behavior but with features markedly different

from those in time-independent Anderson-transition models. This motivated us to

propose a random unitary matrix ensemble, called "power-law random banded unitary

matrix" ensemble, to illuminate the Floquet eigenstate statistics of critical

driven systems. The results based on the proposed random matrix model are

consistent with those obtained from our dynamical examples with or without

time-reversal symmetry.

PMID: 20866506 [PubMed]

105. J Opt Soc Am A Opt Image Sci Vis. 2010 Aug 1;27(8):1851-5. doi:


MultiFRACTAL zone plates.

Giménez F(1), Furlan WD, Calatayud A, Monsoriu JA.

Author information:

(1)I.U. Matemática Pura y Aplicada, Universidad Politécnica de Valencia, E-46022

Valencia, Spain.

We present multiFRACTAL zone plates (MFZPs) as what is to our knowledge a new

family of diffractive lenses whose structure is based on the combination of

FRACTAL zone plates (FZPs) of different orders. The typical result is a composite

of two FZPs with the central one having a first-order focal length f surrounded

by outer zones with a third-order focal length f. The focusing properties of

different members of this family are examined and compared with conventional

composite Fresnel zone plates. It is shown that MFZPs improve the axial

resolution and also give better performance under polychromatic illumination.

PMID: 20686590 [PubMed]

106. J Exp Psychol Gen. 2010 Aug;139(3):436-63. doi: 10.1037/a0019098.

Interaction-dominant dynamics in human cognition: beyond 1/f(alpha) fluctuation.

Ihlen EA(1), Vereijken B.

Author information:

(1)Human Movement Science Programme, Norwegian University of Science and Technology,

Trondheim, Norway.

It has been suggested that human behavior in general and cognitive performance in

particular emerge from coordination between multiple temporal scales. In this

article, we provide quantitative support for such a theory of

interaction-dominant dynamics in human cognition by using wavelet-based

multiFRACTAL analysis and accompanying multiplicative cascading process on the

response series of 4 different cognitive tasks: simple response, word naming,

choice decision, and interval estimation. Results indicated that the major

portion of these response series had multiplicative interactions between temporal

scales, visible as intermittent periods of large and irregular fluctuations

(i.e., a multiFRACTAL structure). Comparing 2 component-dominant models of

1/f(alpha) fluctuations in cognitive performance with the multiplicative

cascading process indicated that the multiFRACTAL structure could not be

replicated by these component-dominant models. Furthermore, a similar

multiFRACTAL structure was shown to be present in a model of self-organized

criticality in the human nervous system, similar to a spatial extension of the

multiplicative cascading process. These results illustrate that a wavelet-based

multiFRACTAL analysis and the multiplicative cascading process form an

appropriate framework to characterize interaction-dominant dynamics in human

cognition. This new framework goes beyond the identification of 1/f(alpha) power

laws and non-gaussian distributions in response series as used in previous

studies. The present article provides quantitative support for a paradigm shift

toward interaction-dominant dynamics in human cognition.

2010 APA, all rights reserved

PMID: 20677894 [PubMed - indexed for MEDLINE]
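
The multiplicative cascading process invoked above can be illustrated with the simplest textbook case, the binomial cascade: at each level every interval splits in two and the mass is divided unevenly between the halves. A minimal pure-Python sketch (the weight p = 0.7 and the depth are arbitrary illustrative choices, not parameters from the paper):

```python
import random

def binomial_cascade(p=0.7, levels=10, seed=1):
    """Random binomial multiplicative cascade.

    Each interval splits in two; one half receives a fraction p of the
    parent's mass and the other 1 - p (the order is randomized),
    producing an intermittent, multifractal measure.
    """
    rng = random.Random(seed)
    measure = [1.0]
    for _ in range(levels):
        nxt = []
        for m in measure:
            w = p if rng.random() < 0.5 else 1.0 - p
            nxt.extend([m * w, m * (1.0 - w)])
        measure = nxt
    return measure

series = binomial_cascade()
print(len(series), round(sum(series), 6))
```

Total mass is conserved at every level, while the local values become ever more intermittent, which is the signature the wavelet-based analysis in the paper is designed to detect.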

107. Med Phys. 2010 Jun;37(6):2827-36.

Generalized FRACTAL dimensions of laser Doppler flowmetry signals recorded from

glabrous and nonglabrous skin.

Buard B(1), Mahé G, Chapeau-Blondeau F, Rousseau D, Abraham P, Humeau A.

Author information:

(1)Groupe ESAIP, 18 rue du 8 mai 1945, BP 80022, 49180 Saint Barthélémy d'Anjou

Cedex, France.

PURPOSE: The technique of laser Doppler flowmetry (LDF) is commonly used to have

a peripheral view of the cardiovascular system. To better understand the

microvascular perfusion signals, the authors herein propose to analyze and

compare the complexity of LDF data recorded simultaneously in glabrous and

nonglabrous skin. Glabrous zones are physiologically different from the others

partly due to the presence of a high density of arteriovenous anastomoses.

METHODS: For this purpose, a multiFRACTAL analysis based on the partition

function and generalized FRACTAL dimensions computation is proposed. The LDF data

processed are recorded simultaneously on the right and left forearms and on the

right and left hand palms of healthy subjects. The signal processing method is

first tested on a multiFRACTAL binomial measure. The generalized FRACTAL

dimensions of the normalized LDF signals are then estimated. Furthermore, for the

first time, the authors estimate the generalized FRACTAL dimensions from a range

of scales corresponding to factors influencing the microcirculation flow

(cardiac, respiratory, myogenic, neurogenic, and endothelial).

RESULTS: Different multiFRACTAL behaviors are found between normalized LDF

signals recorded in the forearms and in the hand palms of healthy subjects. Thus,

the variations in the estimated generalized FRACTAL dimensions of LDF signals

recorded in the hand palms are higher than those of LDF signals recorded in the

forearms. This shows that LDF signals recorded in glabrous zones may be more

complex than those recorded in nonglabrous zones. Furthermore, the results show

that the complexity in the hand palms could be more important at scales

corresponding to the myogenic control mechanism than at the other studied scales.

CONCLUSIONS: These findings suggest that the multiFRACTALity of the normalized

LDF signals is different on glabrous and nonglabrous skin. This difference may

rely on the density of arteriovenous anastomoses and differences in nerve supply

or biochemical properties. This study provides useful information for an in-depth

understanding of LDF data and a more detailed knowledge of the peripheral

cardiovascular system.

PMID: 20632594 [PubMed - indexed for MEDLINE]
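
The partition-function route to generalized FRACTAL dimensions used above, including the binomial-measure sanity check, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the measure parameter p = 0.7 is arbitrary. For the deterministic binomial measure the estimate can be checked against the closed form D(q) = -log2(p^q + (1-p)^q)/(q-1):

```python
import math

def binomial_measure(p=0.7, levels=12):
    """Deterministic binomial measure: left child gets p, right 1 - p."""
    m = [1.0]
    for _ in range(levels):
        m = [w for v in m for w in (v * p, v * (1.0 - p))]
    return m

def generalized_dimension(measure, q):
    """Estimate D(q) from the scaling of the partition function
    Z(q, eps) = sum_i mu_i(eps)^q over dyadic coarse-grainings."""
    pts = []
    m = measure[:]
    while len(m) >= 2:
        eps = 1.0 / len(m)
        z = sum(mu ** q for mu in m if mu > 0.0)
        pts.append((math.log(eps), math.log(z)))
        # merge neighbouring boxes to double the box size
        m = [m[i] + m[i + 1] for i in range(0, len(m), 2)]
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    tau = (sum((a - mx) * (b - my) for a, b in pts) /
           sum((a - mx) ** 2 for a, _ in pts))
    return tau / (q - 1.0)

mu = binomial_measure()
d2 = generalized_dimension(mu, 2.0)
exact = -math.log2(0.7 ** 2 + 0.3 ** 2)  # analytic D(2) for p = 0.7
print(round(d2, 3), round(exact, 3))
```

A flat spectrum of D(q) over q would indicate a monofractal signal; the spread of D(q) is what distinguishes the glabrous from the nonglabrous recordings in the study.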

108. Chaos. 2010 Jun;20(2):023121. doi: 10.1063/1.3427639.

Common multiFRACTALity in the heart rate variability and brain activity of

healthy humans.

Lin DC(1), Sharif A.

Author information:

(1)Department of Mechanical and Industrial Engineering, Ryerson University, Toronto,

Ontario M5B 2K3, Canada.

The influence from the central nervous system on the human multiFRACTAL heart

rate variability (HRV) is examined under the autonomic nervous system

perturbation induced by the head-up-tilt body maneuver. We conducted the

multiFRACTAL factorization analysis to factor out the common multiFRACTAL factor

in the joint fluctuation of the beat-to-beat heart rate and

electroencephalography data. Evidence of a central link in the multiFRACTAL HRV

was found, where the transition towards increased (decreased) HRV multiFRACTAL

complexity is associated with a stronger (weaker) multiFRACTAL correlation

between the central and autonomic nervous systems.

(c) 2010 American Institute of Physics.

PMID: 20590317 [PubMed - indexed for MEDLINE]

109. Genet Mol Res. 2010 May 25;9(2):949-65. doi: 10.4238/vol9-2gmr756.

The Caenorhabditis elegans genome: a multiFRACTAL analysis.

Vélez PE(1), Garreta LE, Martínez E, Díaz N, Amador S, Tischer I, Gutiérrez JM,

Moreno PA.

Author information:

(1)Departamento de Biología, Universidad del Cauca, Popayán, Colombia.

The Caenorhabditis elegans genome has several regular and irregular

characteristics in its nucleotide composition; these are observed within and

between chromosomes. To study these particularities, we carried out a

multiFRACTAL analysis, which requires a large number of exponents to characterize

scaling properties. We looked for a relationship between the genetic information

content of the chromosomes and multiFRACTAL parameters and found less

multiFRACTALity compared to the human genome. Differences in multiFRACTALity

among chromosomes and in regions of chromosomes, and two group averages of

chromosome regions were observed. All these differences were mainly dependent on

differences in the contents of repetitive DNA. Based on these properties, we

propose a nonlinear model for the structure of the C. elegans genome, with some

biological implications. These results suggest that examining differences in

multiFRACTALity is a viable approach for measuring local variations of genomic

information contents along chromosomes. This approach could be extended to other

genomes in order to characterize structural and functional regions of


PMID: 20506082 [PubMed - indexed for MEDLINE]

110. Hum Mov Sci. 2010 Jun;29(3):449-63. doi: 10.1016/j.humov.2009.08.004.

Data series embedding and scale invariant statistics.

Michieli I(1), Medved B, Ristov S.

Author information:

(1)Electronic Department, Ruder Bosković Institute, Zagreb 10000, Croatia.

Data sequences acquired from bio-systems such as human gait data, heart rate

interbeat data, or DNA sequences exhibit complex dynamics that is frequently

described by a long-memory or power-law decay of autocorrelation function. One

way of characterizing that dynamics is through scale invariant statistics or

"FRACTAL-like" behavior. For quantifying scale invariant parameters of

physiological signals several methods have been proposed. Among them the most

common are detrended fluctuation analysis, sample mean variance analyses, power

spectral density analysis, R/S analysis, and recently in the realm of the

multiFRACTAL approach, wavelet analysis. In this paper it is demonstrated that

embedding the time series data in the high-dimensional pseudo-phase space reveals

scale invariant statistics in the simple fashion. The procedure is applied on

different stride interval data sets from human gait measurements time series

(Physio-Bank data library). Results show that the introduced mapping adequately

separates long-memory from random behavior. Smaller gait data sets were analyzed

and scale-free trends for limited scale intervals were successfully detected. The

method was verified on artificially produced time series with known scaling

behavior and with the varying content of noise. The possibility for the method to

falsely detect long-range dependence in the artificially generated short range

dependence series was investigated.

(c) 2009 Elsevier B.V. All rights reserved.

PMID: 20435364 [PubMed - indexed for MEDLINE]

111. Zh Obshch Biol. 2010 Mar-Apr;71(2):115-30.

[FRACTAL aspects of the taxic diversity].

[Article in Russian]

Gelashvili DB, Iakimov VN, Iudin DI, Rozenberg GS, Solntsev LA, Saksonov SV,

Snegireva MS.

Two approaches are suggested for describing taxic diversity as a FRACTAL, or

self-similar, object. One of them called "sampling approach" is based on

necessity of taking into account the sampling process and on proceeding from the

real ecological practice of exploration of the community structure. Verification

of this approach is fulfilled using a multiFRACTAL analysis of the generic

diversity of vascular plants of the National Park "Samarskaya Luka". The

previously revealed regularities of multiFRACTAL spectrum of the species

structure of communities are shown to be true to an extent for the generic

structure, as well. The second approach called "topological" one is based on an

abstract representation of the results of evolutionary process in form of

phylogenetic tree characterized by a non-trivial topological structure.

Approbation of this approach is fulfilled by analysis of the topological structure

of the taxonomic tree of the class Mammalia, our calculations indicating FRACTAL

properties of its graph. These results make it reasonable to suppose that the

taxic diversity, as a replica of the real diversity of the FRACTALly organized

organic world, also possesses self-similar (FRACTAL) structure.

PMID: 20391749 [PubMed - indexed for MEDLINE]

112. Water Sci Technol. 2010;61(8):2113-8. doi: 10.2166/wst.2010.135.

Comparative analysis of time-scaling properties about water pH in Poyang Lake

Inlet and Outlet on the basis of FRACTAL methods.

Shi K(1), Liu CQ, Huang ZW, Zhang B, Su Y.

Author information:

(1)College of Biology and Environmental Sciences, Jishou University, Jishou Hunan

416000, China.

Detrended fluctuation analysis (DFA) and multiFRACTAL methods are applied to the

time-scaling properties analysis of water pH series in Poyang Lake Inlet and

Outlet in China. The results show that these pH series are characterised by

long-term memory and multiFRACTAL scaling, and these characteristics have obvious

differences between the Lake Inlet and Outlet. The comparison results suggest

that monoFRACTAL and multiFRACTAL parameters can be quantitative dynamical

indexes reflecting the capability of anti-acidification of Poyang Lake.

Furthermore, we investigated the frequency-size distribution of pH series in

Poyang Lake Inlet and Outlet. Our findings suggest that water pH is an example of

a self-organised criticality (SOC) process. The results show that it is different

SOC behaviours that result in the differences of power-law relations between pH

series in Poyang Lake Inlet and Outlet. This work can be helpful to improvement

of modelling of lake water quality.

PMID: 20389010 [PubMed - indexed for MEDLINE]
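
Detrended fluctuation analysis, the first of the two methods applied above, admits a compact sketch. The fragment below is an illustrative DFA-1 implementation in pure Python, not the authors' code; the window sizes and the white-noise test input (expected slope near 0.5) are arbitrary choices:

```python
import math
import random

def dfa_exponent(x, windows=(8, 16, 32, 64, 128)):
    """DFA-1: slope of log F(n) versus log n for linearly detrended windows."""
    mean = sum(x) / len(x)
    prof, s = [], 0.0
    for v in x:
        s += v - mean
        prof.append(s)
    pts = []
    for n in windows:
        sq, nseg = 0.0, len(prof) // n
        for seg in range(nseg):
            w = prof[seg * n:(seg + 1) * n]
            t = list(range(n))
            mt, mw = (n - 1) / 2.0, sum(w) / n
            # least-squares linear trend within the window
            beta = (sum((ti - mt) * (wi - mw) for ti, wi in zip(t, w)) /
                    sum((ti - mt) ** 2 for ti in t))
            sq += sum((wi - (mw + beta * (ti - mt))) ** 2
                      for ti, wi in zip(t, w))
        pts.append((math.log(n), 0.5 * math.log(sq / (nseg * n))))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts) /
            sum((a - mx) ** 2 for a, _ in pts))

rng = random.Random(7)
alpha = dfa_exponent([rng.gauss(0.0, 1.0) for _ in range(4000)])
print(round(alpha, 2))
```

An exponent near 0.5 signals uncorrelated fluctuations; values above 0.5, as reported for the pH series, indicate the long-term memory the paper discusses.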

113. Proc Natl Acad Sci U S A. 2010 Apr 27;107(17):7640-5. doi:

10.1073/pnas.0912983107. Epub 2010 Apr 12.

MultiFRACTAL network generator.

Palla G(1), Lovász L, Vicsek T.

Author information:

(1)Statistical and Biological Physics Research Group of the Hungarian Academy of

Sciences, Eötvös University, Budapest, Hungary.

We introduce a new approach to constructing networks with realistic features. Our

method, in spite of its conceptual simplicity (it has only two parameters) is

capable of generating a wide variety of network types with prescribed statistical

properties, e.g., with degree or clustering coefficient distributions of various,

very different forms. In turn, these graphs can be used to test hypotheses or as

models of actual data. The method is based on a mapping between suitably chosen

singular measures defined on the unit square and sparse infinite networks. Such a

mapping has the great potential of allowing for graph theoretical results for a

variety of network topologies. The main idea of our approach is to go to the

infinite limit of the singular measure and the size of the corresponding graph

simultaneously. A very unique feature of this construction is that with the

increasing system size the generated graphs become topologically more structured.

We present analytic expressions derived from the parameters of the--to be

iterated--initial generating measure for such major characteristics of graphs as

their degree, clustering coefficient, and assortativity coefficient

distributions. The optimal parameters of the generating measure are determined

from a simple simulated annealing process. Thus, the present work provides a tool

for researchers from a variety of fields (such as biology, computer science,

or complex systems) enabling them to create a versatile model of their

network data.

PMCID: PMC2867894

PMID: 20385847 [PubMed - indexed for MEDLINE]

114. Neurology. 2010 Apr 6;74(14):1102-7. doi: 10.1212/WNL.0b013e3181d7d8b4.

FRACTAL analysis of retinal vessels suggests that a distinct vasculopathy causes

lacunar stroke.

Doubal FN(1), MacGillivray TJ, Patton N, Dhillon B, Dennis MS, Wardlaw JM.

Author information:

(1)Division of Clinical Neurosciences, University of Edinburgh, Western General

Hospital, Edinburgh, UK.

Comment in

Neurology. 2010 Apr 6;74(14):1088-9.

OBJECTIVES: Lacunar strokes account for 25% of all ischemic strokes and may

represent the cerebral manifestation of a systemic small vessel vasculopathy of

unknown etiology. Altered retinal vessel FRACTAL dimensions may act as a

surrogate marker for diseased cerebral vessels. We used a cross-sectional study

to investigate FRACTAL properties of retinal vessels in lacunar stroke.

METHODS: We recruited patients presenting with lacunar stroke and patients with

minor cortical stroke as controls. All patients were examined by a stroke expert

and had MRI at presentation. Digital retinal photographs were taken of both eyes.

MonoFRACTAL and multiFRACTAL analyses were performed with custom-written

semiautomated software.

RESULTS: We recruited 183 patients. Seventeen were excluded owing to poor

photographic quality, leaving 166 patients (86 with lacunar and 80 with cortical

stroke). The mean age was 67.3 years (SD 11.5 years). The patients with lacunar

stroke were younger but the prevalence of diabetes, hypertension, and white

matter hyperintensities did not differ between the groups. The mean Dbox

(monoFRACTAL dimension) was 1.42 (SD 0.02), the mean D0 (multiFRACTAL dimension)

1.67 (SD 0.03). With multivariate analysis, decreased Dbox and D0 (both

representing decreased branching complexity) were associated with increasing age

and lacunar stroke subtype after correcting for hypertension, diabetes, stroke

severity, and white matter hyperintensity scores.

CONCLUSIONS: Lacunar stroke subtype and increasing age are associated with

decreased FRACTAL dimensions, suggesting a loss of branching complexity. Further

studies should concentrate on longitudinal associations with other manifestations

of cerebral small vessel disease.

PMCID: PMC2865776

PMID: 20368631 [PubMed - indexed for MEDLINE]
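
The monoFRACTAL box-counting dimension Dbox used in this study can be sketched generically. This is not the authors' semiautomated retinal software; the chaos-game Sierpinski set below is an illustrative stand-in with known dimension log 3 / log 2 ≈ 1.585:

```python
import math
import random

def box_count_dimension(points, exponents=range(1, 7)):
    """Box-counting dimension of a 2-D point set in the unit square:
    slope of log N(eps) versus log(1/eps) over dyadic box sizes."""
    pts = []
    for k in exponents:
        eps = 2.0 ** -k
        occupied = {(int(x / eps), int(y / eps)) for x, y in points}
        pts.append((k * math.log(2.0), math.log(len(occupied))))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts) /
            sum((a - mx) ** 2 for a, _ in pts))

# chaos-game Sierpinski gasket, exact dimension log 3 / log 2
rng = random.Random(42)
corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
x, y = 0.3, 0.3
samples = []
for _ in range(40000):
    cx, cy = rng.choice(corners)
    x, y = (x + cx) / 2.0, (y + cy) / 2.0
    samples.append((x, y))

d = box_count_dimension(samples)
print(round(d, 2))
```

Applied to a binarized retinal vessel map instead of the test set, the same slope would play the role of Dbox, with lower values indicating the reduced branching complexity reported for lacunar stroke.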

115. Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Feb;81(2 Pt 2):026102. Epub 2010

Feb 8.

Hierarchical multiFRACTAL representation of symbolic sequences and application to

human chromosomes.

Provata A(1), Katsaloulis P.

Author information:

(1)Institute of Physical Chemistry, National Center for Scientific Research

Demokritos, 15310 Athens, Greece.

The two-dimensional density correlation matrix is constructed for symbolic

sequences using contiguous segments of arbitrary size. The multiFRACTAL spectrum

obtained from this matrix motif is shown to characterize the correlations in the

symbolic sequences. This method is applied to entire human chromosomes, shuffled

human chromosomes, reconstructed human genomic sequences and to artificial random

sequences. It is shown that all human chromosomes have common characteristics in

their multiFRACTAL spectrum and deviate substantially from random and

uncorrelated sequences of the same size. Small deviations are observed between

the longer and the shorter chromosomes, especially for the higher (in absolute

values) statistical moments. The correlations are crucial for the form of the

multiFRACTAL spectrum; surrogate shuffled chromosomes present randomlike

spectrum, distinctly different from the actual chromosomes. Analytical approaches

based on hierarchical superposition of tensor products show that retaining pair

correlations in the sequences leads to a closer representation of the genomic

multiFRACTAL spectra, especially in the region of negative exponents, due to the

underrepresentation of various functional units (such as the cytosine-guanine CG

combination and its complementary GC complex). Retaining higher-order

correlations in the construction of the tensor products is a way to approach

closer the structure of the multiFRACTAL spectra of the actual genomic sequences.

This hierarchical approach is generic and is applicable to other correlated

symbolic sequences.

PMID: 20365626 [PubMed - indexed for MEDLINE]

116. Phys Rev E Stat Nonlin Soft Matter Phys. 2009 Dec;80(6 Pt 1):061126. Epub 2009

Dec 18.

MultiFRACTAL analysis of light scattering-intensity fluctuations.

Shayeganfar F(1), Jabbari-Farouji S, Movahed MS, Jafari GR, Tabar MR.

Author information:

(1)Department of Physics, Sharif University of Technology, PO Box 11365-9161,

Tehran, Iran.

We provide a simple interpretation of non-Gaussian nature of the light

scattering-intensity fluctuations from an aging colloidal suspension of Laponite

using the multiplicative cascade model, Markovian method, and volatility

correlations. The cascade model and Markovian method enable us to reproduce most

of recent empirical findings: long-range volatility correlations and non-Gaussian

statistics of intensity fluctuations. We provide evidence that the intensity

increments Deltax(tau)=I(t+tau)-I(t), upon different delay time scales tau, can

be described as a Markovian process evolving in tau. Thus, the tau dependence of

the probability density function p(Deltax,tau) on the delay time scale tau can be

described by a Fokker-Planck equation. We also demonstrate how drift and

diffusion coefficients in the Fokker-Planck equation can be estimated directly

from the data.

PMID: 20365137 [PubMed - indexed for MEDLINE]

117. Phys Rev E Stat Nonlin Soft Matter Phys. 2009 Nov;80(5 Pt 2):056302. Epub 2009

Nov 6.

Large deviation theory for coin tossing and turbulence.

Chakraborty S(1), Saha A, Bhattacharjee JK.

Author information:

(1)Niels Bohr Institute, Niels Bohr International Academy, Blegdamsvej 17, 2100

Copenhagen Ø, Denmark.

Large deviations play a significant role in many branches of nonequilibrium

statistical physics. They are difficult to handle because their effects, though

small, are not amenable to perturbation theory. Even the Gaussian model, which is

the usual initial step for most perturbation theories, fails to be a starting

point while discussing intermittency in fluid turbulence, where large deviations

dominate. Our contention is: in the large deviation theory, the central role is

played by the distribution associated with the tossing of a coin and the simple

coin toss is the "Gaussian model" of problems where rare events play significant

role. We illustrate this by applying it to calculate the multiFRACTAL exponents

of the order structure factors in fully developed turbulence.

PMID: 20365068 [PubMed - indexed for MEDLINE]
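
The order-q structure functions whose multiFRACTAL exponents are computed above can be estimated very simply from a time series: S_q(n) = mean |x(t+n) - x(t)|^q, with ζ(q) the slope of log S_q(n) versus log n. A pure-Python sketch (the lags and the monofractal random-walk test signal, for which ζ(q) = q/2, are illustrative choices):

```python
import math
import random

def structure_exponent(x, q, lags=(1, 2, 4, 8, 16, 32)):
    """Order-q structure-function exponent zeta(q): slope of
    log S_q(n) vs log n, with S_q(n) = mean |x(t+n) - x(t)|^q."""
    pts = []
    for n in lags:
        s = sum(abs(x[i + n] - x[i]) ** q
                for i in range(len(x) - n)) / (len(x) - n)
        pts.append((math.log(n), math.log(s)))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts) /
            sum((a - mx) ** 2 for a, _ in pts))

# plain random walk: monofractal, so zeta(q) should grow linearly as q/2
rng = random.Random(3)
walk, s = [], 0.0
for _ in range(20000):
    s += rng.gauss(0.0, 1.0)
    walk.append(s)

z1 = structure_exponent(walk, 1.0)
z2 = structure_exponent(walk, 2.0)
print(round(z1, 2), round(z2, 2))
```

Intermittency of the kind the paper models via coin-toss large deviations would bend ζ(q) away from this linear behaviour at high q.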

118. Physiol Meas. 2010 Apr;31(4):565-80. doi: 10.1088/0967-3334/31/4/008. Epub 2010

Mar 12.

MultiFRACTAL and nonlinear assessment of autonomous nervous system response

during transient myocardial ischaemia.

Magrans R(1), Gomis P, Caminal P, Wagner G.

Author information:

(1)Departament d'Enginyeria de Sistemes, Universitat Politècnica de Catalunya,

Barcelona, Spain.

We assess autonomic nervous system response during prolonged percutaneous

transluminal coronary angioplasty (PTCA) using heart rate variability analysis

with multiFRACTAL indices. These indices are used to evaluate the effects of the

PTCA procedures at different arteries and locations. A total of 55 patients from

the Staff3 database, with no prior history of myocardial infarction, were

included in the study. The indices increased significantly during the transient

ischaemia and reperfusion periods, indicating an increase in nonlinear

multiFRACTAL characteristics and a change in temporal correlations in heartbeat

fluctuations. This indicates that significant multiFRACTAL and nonlinear complex

reactions in the autonomic control of the heart rate occurred during coronary

artery occlusions and suggests that the multiFRACTAL indices may be a promising

nonlinear technique for evaluating autonomic nervous system response in the

presence of transient myocardial ischaemia.

PMID: 20228447 [PubMed - indexed for MEDLINE]

119. Chaos. 2009 Dec;19(4):043129. doi: 10.1063/1.3273187.

Computing the multiFRACTAL spectrum from time series: an algorithmic approach.

Harikrishnan KP(1), Misra R, Ambika G, Amritkar RE.

Author information:

(1)Department of Physics, The Cochin College, Cochin 682 002, India.

We show that the existing methods for computing the f(alpha) spectrum from a time

series can be improved by using a new algorithmic scheme. The scheme relies on

the basic idea that the smooth convex profile of a typical f(alpha) spectrum can

be fitted with an analytic function involving a set of four independent

parameters. While the standard existing schemes [P. Grassberger et al., J. Stat.

Phys. 51, 135 (1988); A. Chhabra and R. V. Jensen, Phys. Rev. Lett. 62, 1327

(1989)] generally compute only an incomplete f(alpha) spectrum (usually the top

portion), we show that this can be overcome by an algorithmic approach, which is

automated to compute the D(q) and f(alpha) spectra from a time series for any

embedding dimension. The scheme is first tested with the logistic attractor with

known f(alpha) curve and subsequently applied to higher-dimensional cases. We

also show that the scheme can be effectively adapted for analyzing practical time

series involving noise, with examples from two widely different real world

systems. Moreover, some preliminary results indicating that the set of four

independent parameters may be used as diagnostic measures are also included.

PMID: 20059225 [PubMed - indexed for MEDLINE]
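
For readers who want to experiment, the kind of f(alpha) computation this entry discusses can be sketched directly. The following is an illustrative sketch only, not code from the paper: it evaluates one (alpha, f) point per moment order q via the standard direct (Chhabra-Jensen) normalization, on a synthetic binomial cascade whose weight p = 0.3 and 10-level depth are arbitrary choices made here for the demonstration.

```python
import math

def binomial_cascade(levels, p=0.3):
    """Deterministic binomial multiplicative measure on 2**levels boxes."""
    weights = [1.0]
    for _ in range(levels):
        # each box splits in two, carrying fractions p and 1-p of its mass
        weights = [w * f for w in weights for f in (p, 1.0 - p)]
    return weights

def f_alpha(measure, q):
    """One (alpha, f) point of the multifractal spectrum (direct method)."""
    eps = 1.0 / len(measure)          # box size at this resolution
    boxes = [m for m in measure if m > 0.0]
    z = sum(m ** q for m in boxes)
    mu = [m ** q / z for m in boxes]  # q-normalized box measures
    log_eps = math.log(eps)
    alpha = sum(u * math.log(m) for u, m in zip(mu, boxes)) / log_eps
    f = sum(u * math.log(u) for u in mu) / log_eps
    return alpha, f

measure = binomial_cascade(10, p=0.3)
for q in (-5.0, 0.0, 5.0):
    a, f = f_alpha(measure, q)
    print(f"q={q:+.0f}  alpha={a:.4f}  f={f:.4f}")
```

At q = 0 the method recovers the support dimension f = 1 exactly, which is a convenient sanity check before applying it to an empirical time series.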

120. Front Physiol. 2010 Oct 14;1:12. doi: 10.3389/fphys.2010.00012. eCollection 2010.

FRACTAL physiology and the fractional calculus: a perspective.

West BJ.

Author information:

Information Science Directorate, U.S. Army Research Office Research Triangle

Park, NC, USA.

This paper presents a restricted overview of FRACTAL Physiology focusing on the

complexity of the human body and the characterization of that complexity through

FRACTAL measures and their dynamics, with FRACTAL dynamics being described by the

fractional calculus. Not only are anatomical structures (Grizzi and

Chiriva-Internati, 2005), such as the convoluted surface of the brain, the lining

of the bowel, neural networks and placenta, FRACTAL, but the output of dynamical

physiologic networks are FRACTAL as well (Bassingthwaighte et al., 1994). The

time series for the inter-beat intervals of the heart, inter-breath intervals and

inter-stride intervals have all been shown to be FRACTAL and/or multiFRACTAL

statistical phenomena. Consequently, the FRACTAL dimension turns out to be a

significantly better indicator of organismic functions in health and disease than

the traditional average measures, such as heart rate, breathing rate, and stride

rate. The observation that human physiology is primarily FRACTAL was first made

in the 1980s, based on the analysis of a limited number of datasets. We review

some of these phenomena herein by applying an allometric aggregation approach to

the processing of physiologic time series. This straight forward method

establishes the scaling behavior of complex physiologic networks and some dynamic

models capable of generating such scaling are reviewed. These models include

simple and fractional random walks, which describe how the scaling of correlation

functions and probability densities are related to time series data.

Subsequently, it is suggested that a proper methodology for describing the

dynamics of FRACTAL time series may well be the fractional calculus, either

through the fractional Langevin equation or the fractional diffusion equation. A

fractional operator (derivative or integral) acting on a FRACTAL function, yields

another FRACTAL function, allowing us to construct a fractional Langevin equation

to describe the evolution of a FRACTAL statistical process. Control of

physiologic complexity is one of the goals of medicine, in particular,

understanding and controlling physiological networks in order to ensure their

proper operation. We emphasize the difference between homeostatic and allometric

control mechanisms. Homeostatic control has a negative feedback character, which

is both local and rapid. Allometric control, on the other hand, is a relatively

new concept that takes into account long-time memory, correlations that are

inverse power law in time, as well as long-range interactions in complex

phenomena as manifest by inverse power-law distributions in the network variable.

We hypothesize that allometric control maintains the FRACTAL character of erratic

physiologic time series to enhance the robustness of physiological networks.

Moreover, allometric control can often be described using the fractional calculus

to capture the dynamics of complex physiologic networks.

PMCID: PMC3059975

PMID: 21423355 [PubMed]

121. Conf Proc IEEE Eng Med Biol Soc. 2009;2009:1808-11. doi:


MultiFRACTAL characterization of the autonomous nervous system during prolonged

coronary artery occlusion.

Magrans R(1), Gomis P, Caminal P, Wagner G.

Author information:

(1)Dept. ESAII, Universitat Politècnica de Catalunya (UPC), and CIBER de

Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), c/ Pau Gargallo 5,

08028, Barcelona, Spain.

We assess the autonomic nervous system response during prolonged percutaneous

transluminal coronary angioplasty (PTCA) by heart rate variability analysis using

multiFRACTAL indices. These indices are also used to evaluate the effects of the

PTCA at different arteries. The indices augmented significantly during transient

ischemia and reperfusion periods indicating an increase of multiFRACTAL degree

and a decrease of the long-range dependence on heartbeat fluctuations. This

indicates that significant multiFRACTAL complex reactions of autonomic control of

the heart rate occurred during coronary artery occlusions. Key words:

multiFRACTAL analysis, heartbeat fluctuations, myocardial ischemia, coronary

artery occlusion.

PMID: 19964563 [PubMed - indexed for MEDLINE]

122. Clin Physiol Funct Imaging. 2010 Jan;30(1):43-50. doi:

10.1111/j.1475-097X.2009.00902.x. Epub 2009 Oct 2.

Influence of smoking abstinence and nicotine replacement therapy on heart rate

and QT time-series.

Lewis MJ(1), Balaji G, Dixon H, Syed Y, Lewis KE.

Author information:

(1)School of Engineering, Swansea University, Wales, UK.

SUMMARY: Many smokers attempt to quit without using nicotine replacement therapy

(NRT) or pharmacotherapy, i.e. 'cold-turkey'. The cardiac implications of this

are important but are incompletely understood. Previous studies have associated

smoking cessation with improvements in heart rate (HR) and its variability, but

its influence on QT time-series is unclear. Furthermore, the relative influence

on these parameters of acute nicotine withdrawal and of NRT has not been

adequately compared. Additional insight might come from analysing the dynamic

(e.g. FRACTAL) properties of electrocardiographic data during different levels of

nicotine exposure. We examined the influence of smoking cessation, during

cold-turkey and subsequent NRT, on HR and QT time-series during 30 days of

smoking abstinence. Seven smokers and sixteen healthy non-smokers received ECG

monitoring at baseline (Day 0). Smokers subsequently refrained from smoking

without using NRT for 24 h, and then received NRT for 29 days. ECG monitoring was

repeated at Days 1, 7, 30. Following smoking cessation we observed that: HR and

rate-corrected QT were both reduced, heart rate variability (HRV) increased

(improved), and QT variability index (QTVI) showed signs of improvement (trend

only). Improvements in HR and QT were maintained throughout NRT use, whilst

improvements in HRV and QTVI were sustained for at least the early stages of NRT.

The dynamic (multiFRACTAL) properties of HR and QT were similar for smokers and

non-smokers, and were unchanged by smoking abstinence or NRT. Our results provide

tentative evidence that electrocardiographic improvements during a cold-turkey

smoking quit attempt (acute nicotine withdrawal) are maintained during NRT


PMID: 19799615 [PubMed - indexed for MEDLINE]

123. Dokl Biol Sci. 2009 Jul-Aug;427:374-7.

MultiFRACTAL analysis of the species structure of helminthic communities of small

mammals in the Samarskaya Luka.

Gelashvili DB(1), Iudin DI, Solntsev LA, Snegireva MS, Rozenberg GS, Evlanov IA,

Kirillova NJ, Kirillov AA.

Author information:

(1)Nizhni Novgorod State University, pr Gagarina 23, Nizhni Novgorod, 603950 Russia.

PMID: 19760887 [PubMed - indexed for MEDLINE]

124. Phys Rev Lett. 2009 Jun 19;102(24):244102. Epub 2009 Jun 17.

Butterfly Floquet spectrum in driven SU(2) systems.

Wang J(1), Gong J.

Author information:

(1)Temasek Laboratories, National University of Singapore, 117542, Singapore.

The Floquet spectrum of a class of driven SU(2) systems is shown to display a

butterfly pattern with multiFRACTAL properties. The level crossing between

Floquet states of the same parity or different parities is studied. The results

are relevant to studies of FRACTAL statistics, quantum chaos, coherent

destruction of tunneling, and the validity of mean-field descriptions of

Bose-Einstein condensates.

PMID: 19659010 [PubMed]

125. J Exp Psychol Hum Percept Perform. 2009 Aug;35(4):1072-91. doi: 10.1037/a0015017.

Spatiotemporal symmetry and multiFRACTAL structure of head movements during

dyadic conversation.

Ashenfelter KT(1), Boker SM, Waddell JR, Vitanov N.

Author information:

(1)U.S. Census Bureau, USA.

This study examined the influence of sex, social dominance, and context on

motion-tracked head movements during dyadic conversations. Windowed

cross-correlation analyses found high peak correlation between conversants' head

movements over short (approximately 2-s) intervals and a high degree of

nonstationarity. Nonstationarity in head movements was found to be positively

related to the number of men in a conversation. Surrogate data analysis

offsetting the conversants' time series by a large lag was unable to reject the

null hypothesis that the observed high peak correlations were unrelated to

short-term coordination between conversants. One way that high peak correlations

could be observed when 2 time series are offset by a large time lag is for each

time series to exhibit self-similarity over a range of scales. MultiFRACTAL

analysis found small-scale fluctuations to be persistent, tau(q) < 0.5, and

large-scale fluctuations to be antipersistent, tau(q) > 0.5. These results are

consistent with a view that symmetry is formed between conversants over short

intervals and that this symmetry is broken at longer, irregular intervals.

PMID: 19653750 [PubMed - indexed for MEDLINE]

126. J Neurosci Methods. 2009 Dec 15;185(1):116-24. doi:

10.1016/j.jneumeth.2009.07.027. Epub 2009 Jul 29.

Spatio-temporal analysis of monoFRACTAL and multiFRACTAL properties of the human

sleep EEG.

Weiss B(1), Clemens Z, Bódizs R, Vágó Z, Halász P.

Author information:

(1)Faculty of Information Technology, Pázmány Péter Catholic University, Budapest,


FRACTALity is a common property in nature. It can also be observed in time series

representing dynamics of complex processes. Therefore FRACTAL analysis could be a

useful tool to describe the dynamics of brain electrical activities in

physiological and pathological conditions. In this study, we carried out a

spatio-temporal analysis of monoFRACTAL and multiFRACTAL properties of

whole-night sleep EEG recordings. We estimated the Hurst exponent (H) and the

range of FRACTAL spectra (dD) in 10 healthy subjects. We found higher H values

during NREM4 compared to NREM2 and REM in all electrodes. Measure dD showed an

opposite trend. Differences of H and dD between NREM2 and REM reached

significance at circumscribed regions only. Our results contribute to a deeper

understanding of the FRACTAL nature of brain electrical activities and may have

implications for automatic classification of sleep stages.

PMID: 19646476 [PubMed - indexed for MEDLINE]
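
The Hurst exponent H estimated in the study above is commonly obtained by detrended fluctuation analysis (DFA). The sketch below is not the authors' method, just a minimal DFA-1 estimator under stated assumptions (first-order detrending, non-overlapping windows, window sizes chosen arbitrarily here), applied to white noise, for which the exponent should come out near 0.5.

```python
import math
import random

def dfa_exponent(x, scales=(8, 16, 32, 64, 128, 256)):
    """Minimal DFA-1 scaling exponent of a 1-D series."""
    n = len(x)
    mean = sum(x) / n
    # integrated profile of the mean-centered series
    profile, acc = [], 0.0
    for v in x:
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        sq_sum, count = 0.0, 0
        for start in range(0, n - s + 1, s):
            seg = profile[start:start + s]
            # least-squares linear detrend within the window
            t = list(range(s))
            tm, sm = (s - 1) / 2.0, sum(seg) / s
            cov = sum((ti - tm) * (vi - sm) for ti, vi in zip(t, seg))
            var = sum((ti - tm) ** 2 for ti in t)
            slope = cov / var
            for ti, vi in zip(t, seg):
                resid = vi - (sm + slope * (ti - tm))
                sq_sum += resid * resid
                count += 1
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sq_sum / count))
    # slope of log F(s) versus log s is the DFA exponent
    ms_, mf = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
    num = sum((a - ms_) * (b - mf) for a, b in zip(log_s, log_f))
    den = sum((a - ms_) ** 2 for a in log_s)
    return num / den

random.seed(42)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
print(f"DFA exponent of white noise: {dfa_exponent(white):.3f}")  # near 0.5
```

Applied to an EEG segment instead of synthetic noise, the same slope estimate would play the role of H in the comparison across sleep stages.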

127. Eur Biophys J. 2009 Oct;38(8):1115-25. doi: 10.1007/s00249-009-0516-z. Epub 2009

Jul 18.

FRACTAL analysis and ionic dependence of endocytotic membrane activity of human

breast cancer cells.

Krasowska M(1), Grzywna ZJ, Mycielska ME, Djamgoz MB.

Author information:

(1)Division of Cell and Molecular Biology, Neuroscience Solutions to Cancer Research

Group, Imperial College London, Sir Alexander Fleming Building, South Kensington

Campus, London SW7 2AZ, UK.

The endocytic membrane activities of two human breast cancer cell lines

(MDA-MB-231 and MCF-7) of strong and weak metastatic potential, respectively,

were studied in a comparative approach. Uptake of horseradish peroxidase was used

to follow endocytosis. Dependence on ionic conditions and voltage-gated sodium

channel (VGSC) activity were characterized. FRACTAL methods were used to analyze

quantitative differences in vesicular patterning. Digital quantification showed

that MDA-MB-231 cells took up more tracer (i.e., were more endocytic) than MCF-7

cells. For the former, uptake was totally dependent on extracellular Na(+) and

partially dependent on extracellular and intracellular Ca(2+) and protein kinase

activity. Analyzing the generalized FRACTAL dimension (D(q)) and its Legendre

transform f(alpha) revealed that under control conditions, all multiFRACTAL

parameters determined had values greater for MDA-MB-231 compared with MCF-7

cells, consistent with endocytic/vesicular activity being more developed in the

strongly metastatic cells. All FRACTAL parameters studied were sensitive to the

VGSC blocker tetrodotoxin (TTX). Some of the parameters had a "simple" dependence

on VGSC activity, if present, whereby pretreatment with TTX reduced the values

for the MDA-MB-231 cells and eliminated the differences between the two cell

lines. For other parameters, however, there was a "complex" dependence on VGSC

activity. The possible physical/physiological meaning of the mathematical

parameters studied and the nature of involvement of VGSC activity in control of

endocytosis/secretion are discussed.

PMID: 19618177 [PubMed - indexed for MEDLINE]

128. Chaos. 2009 Jun;19(2):028507. doi: 10.1063/1.3152223.

MultiFRACTALity and heart rate variability.

Sassi R(1), Signorini MG, Cerutti S.

Author information:

(1)Dipartimento di Tecnologie dell'Informazione, Universita degli studi di Milano,

via Bramante 65, 26013 Crema, Italy.

In this paper, we participate in the discussion set forth by the editor of Chaos

for the controversy, "Is the normal heart rate chaotic?" Our objective was to

debate the question, "Is there some more appropriate term to characterize the

heart rate variability (HRV) fluctuations?" We focused on the approximately 24 h

RR series prepared for this topic and tried to verify with two different

techniques, generalized structure functions and wavelet transform modulus maxima,

if they might be described as being multiFRACTAL. For normal and congestive heart

failure subjects, the h(q) exponents were found to decrease with increasing q

with both methods, as it should be for multiFRACTAL signals. We then built 40

surrogate series to further verify such hypothesis. For most of the series

(approximately 75%-80% of cases) multiFRACTALity stood the test of the surrogate

data employed. On the other hand, series coming from patients in atrial

fibrillation showed a small, if any, degree of multiFRACTALity. The population

analyzed is too small for definite conclusions, but the study supports the use of

multiFRACTAL series to model HRV. Also it suggests that the regulatory action of

autonomous nervous system might play a role in the observed multiFRACTALity.

PMID: 19566282 [PubMed - indexed for MEDLINE]
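
One of the two techniques named above, generalized structure functions, is simple enough to sketch. The code below is an illustration, not the paper's implementation: it fits the scaling S_q(tau) ~ tau^(q*h(q)) on a synthetic Brownian-like path (lag set and series length are arbitrary choices here), where a monofractal h(q) close to 0.5 is expected for every q.

```python
import math
import random

def structure_h(y, q, lags=(1, 2, 4, 8, 16, 32, 64)):
    """Estimate h(q) from generalized structure functions
    S_q(tau) = <|y(t+tau) - y(t)|^q> ~ tau^(q*h(q))."""
    log_t, log_s = [], []
    for tau in lags:
        diffs = [abs(y[i + tau] - y[i]) ** q for i in range(len(y) - tau)]
        log_t.append(math.log(tau))
        log_s.append(math.log(sum(diffs) / len(diffs)))
    # least-squares slope of log S_q versus log tau gives zeta(q) = q*h(q)
    mt, ms = sum(log_t) / len(log_t), sum(log_s) / len(log_s)
    zeta = sum((a - mt) * (b - ms) for a, b in zip(log_t, log_s)) \
        / sum((a - mt) ** 2 for a in log_t)
    return zeta / q   # constant in q for a monofractal signal

random.seed(7)
walk, acc = [], 0.0   # Brownian-like path: h(q) close to 0.5 for every q
for _ in range(8192):
    acc += random.gauss(0.0, 1.0)
    walk.append(acc)
for q in (1.0, 2.0, 3.0):
    print(f"h({q:.0f}) = {structure_h(walk, q):.3f}")
```

A genuinely multifractal series, such as the RR-interval records discussed in the entry, would instead show h(q) decreasing with q.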

129. Chaos. 2009 Jun;19(2):028503. doi: 10.1063/1.3152006.

Normal heartbeat series are nonchaotic, nonlinear, and multiFRACTAL: new evidence

from semiparametric and parametric tests.

Baillie RT(1), Cecen AA, Erkal C.

Author information:

(1)Departments of Economics and Finance, Michigan State University, East Lansing,

Michigan 48824, USA.

We present new evidence that normal heartbeat series are nonchaotic, nonlinear,

and multiFRACTAL. In addition to considering the largest Lyapunov exponent and

the correlation dimension, the results of the parametric and semiparametric

estimation of the long memory parameter (long-range dependence) unambiguously

reveal that the underlying process is nonstationary, multiFRACTAL, and has strong


PMID: 19566278 [PubMed - indexed for MEDLINE]

130. Chaos. 2009 Jun;19(2):028501. doi: 10.1063/1.3156832.

Introduction to controversial topics in nonlinear science: is the normal heart

rate chaotic?

Glass L.

Author information:

Department of Physiology, McGill University, Montreal, Quebec H3G 1Y6, Canada.

In June 2008, the editors of Chaos decided to institute a new section to appear

from time to time that addresses timely and controversial topics related to

nonlinear science. The first of these deals with the dynamical characterization

of human heart rate variability. We asked authors to respond to the following

questions: Is the normal heart rate chaotic? If the normal heart rate is not

chaotic, is there some more appropriate term to characterize the fluctuations

(e.g., scaling, FRACTAL, multiFRACTAL)? How does the analysis of heart rate

variability elucidate the underlying mechanisms controlling the heart rate? Do

any analyses of heart rate variability provide clinical information that can be

useful in medical assessment (e.g., in helping to assess the risk of sudden

cardiac death)? If so, please indicate what additional clinical studies would be

useful for measures of heart rate variability to be more broadly accepted by the

medical community. In addition, as a challenge for analysis methods, PhysioNet

[A. L. Goldberger et al., "PhysioBank, PhysioToolkit, and PhysioNet: Components

of a new research resource for complex physiologic signals," Circulation 101,

e215-e220 (2000)] provided data sets from 15 patients of whom five were normal,

five had heart failure, and five had atrial fibrillation. This introductory essay

summarizes

the main issues and introduces the essays that respond to these questions.

PMID: 19566276 [PubMed - indexed for MEDLINE]

131. Chaos. 2009 Jun;19(2):026108. doi: 10.1063/1.3143035.

Understanding the complexity of human gait dynamics.

Scafetta N(1), Marchi D, West BJ.

Author information:

(1)Department of Physics, Duke University, Durham, North Carolina 27708, USA.

Time series of human gait stride intervals exhibit FRACTAL and multiFRACTAL

properties under several conditions. Records from subjects walking at normal,

slow, and fast pace are analyzed to determine changes in the FRACTAL

scalings as a function of the stress condition of the system. Records from

subjects of different ages, from children to the elderly, and patients suffering from

neurodegenerative disease are analyzed to determine changes in the FRACTAL

scalings as a function of the physical maturation or degeneration of the system.

A supercentral pattern generator model is presented to simulate the above two

properties that are typically found in dynamical network performance: that is,

how a dynamical network responds to stress and to evolution.

PMID: 19566268 [PubMed - indexed for MEDLINE]

132. Chaos. 2009 Jun;19(2):026101. doi: 10.1063/1.3155067.

Introduction to focus issue: bipedal locomotion--from robots to humans.

Milton JG.

Author information:

Joint Science Department, The Claremont Colleges, 925 N. Mills Ave., Claremont,

California 91711, USA.

Running and walking, collectively referred to as bipedal locomotion, represent

self-organized behaviors generated by a spatially distributed dynamical system

operating under the constraint that a person must be able to move without falling

down. The organizing principles involve both forces actively regulated by the

nervous system and those generated passively by the biomechanical properties of

the musculoskeletal system and the environment in which the movements occur. With

the development of modern motion capture and electrophysiological techniques it

has become possible to explore the dynamical interplay between the passive and

active controllers of locomotion in a manner that directly compares observation

to predictions made by relevant mathematical and computer models. Consequently,

many of the techniques initially developed to study nonlinear dynamical systems,

including stability analyses, phase resetting and entrainment properties of limit

cycles, and FRACTAL and multiFRACTAL analysis, have come to play major roles in

guiding progress. This Focus Issue discusses bipedal locomotion from the point of

view of dynamical systems theory with the goal of stimulating discussion between

the dynamical systems, physics, biomechanics, and neuroscience communities.

PMID: 19566261 [PubMed - indexed for MEDLINE]

133. Med Image Anal. 2009 Aug;13(4):634-49. doi: 10.1016/ Epub

2009 May 27.

FRACTAL and multiFRACTAL analysis: a review.

Lopes R(1), Betrouni N.

Author information:

(1)Inserm, U703, Pavillon Vancostenobel, CHRU Lille, Lille Cedex 59037, France.

In recent years, FRACTAL and multiFRACTAL geometries have been applied extensively

in many medical signal (1D, 2D or 3D) analysis applications like pattern

recognition, texture analysis and segmentation. Application of this geometry

relies heavily on the estimation of the FRACTAL features. Various methods were

proposed to estimate the FRACTAL dimension or multiFRACTAL spectrum of a signal.

This article presents an overview of these algorithms, the way they work, their

benefits and their limits. The aim of this review is to explain and to categorize

the various algorithms into groups and their application in the field of medical

signal analysis.

PMID: 19535282 [PubMed - indexed for MEDLINE]

134. Phys Rev E Stat Nonlin Soft Matter Phys. 2009 Apr;79(4 Pt 2):046111. Epub 2009

Apr 30.

Drip paintings and FRACTAL analysis.

Jones-Smith K(1), Mathur H, Krauss LM.

Author information:

(1)Department of Physics, Case Western Reserve University, Cleveland, Ohio

44106-7079, USA.

It has been claimed that FRACTAL analysis can be applied to unambiguously

characterize works of art such as the drip paintings of Jackson Pollock. This

academic issue has become of more general interest following the recent discovery

of a cache of disputed Pollock paintings. We definitively demonstrate here, by

analyzing paintings by Pollock and others, that FRACTAL criteria provide no

information about artistic authenticity. This work has led us to a result in

FRACTAL analysis of more general scientific significance: we show that the

statistics of the "covering staircase" (closely related to the box-counting

staircase) provide a way to characterize geometry and distinguish FRACTALs from

Euclidean objects. Finally we present a discussion of the composite of two

FRACTALs, a problem that was first investigated by Muzy. We show that the

composite is not generally scale invariant and that it exhibits complex

multiFRACTAL scaling in the small distance asymptotic limit.

PMID: 19518305 [PubMed]
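
The box-counting staircase mentioned above rests on the basic box-counting dimension estimate, which is easy to reproduce. The sketch below is not the authors' analysis: it counts occupied boxes on an exact middle-third Cantor construction (depth and scale range are arbitrary choices made here), using integer arithmetic so the box indices are exact, and recovers the known dimension log 2 / log 3.

```python
import math

def cantor_points(depth):
    """Left endpoints of the level-`depth` middle-third Cantor intervals,
    kept as exact base-3 integers m (the point itself is m / 3**depth)."""
    ms = [0]
    for level in range(depth):
        step = 3 ** (depth - level - 1)
        ms = [m + d * step for m in ms for d in (0, 2)]
    return ms

def box_counting_dimension(ms, depth, k_max=6):
    """Slope of log N(eps) versus log(1/eps) at scales eps = 3**-k."""
    log_inv_eps, log_n = [], []
    for k in range(1, k_max + 1):
        # box index of a point at scale 3**-k, computed exactly in integers
        n_boxes = len({m // 3 ** (depth - k) for m in ms})
        log_inv_eps.append(k * math.log(3.0))
        log_n.append(math.log(n_boxes))
    mx, my = sum(log_inv_eps) / k_max, sum(log_n) / k_max
    num = sum((a - mx) * (b - my) for a, b in zip(log_inv_eps, log_n))
    den = sum((a - mx) ** 2 for a in log_inv_eps)
    return num / den

points = cantor_points(10)
dim = box_counting_dimension(points, 10)
print(f"estimated dimension: {dim:.4f}  "
      f"(log 2 / log 3 = {math.log(2) / math.log(3):.4f})")
```

As the entry argues, agreement on a toy set like this does not make the same count a signature of authorship when applied to a painting.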

135. Phys Rev E Stat Nonlin Soft Matter Phys. 2009 Apr;79(4 Pt 1):041920. Epub 2009

Apr 21.

Levels of complexity in scale-invariant neural signals.

Ivanov PCh(1), Ma QD, Bartsch RP, Hausdorff JM, Nunes Amaral LA,

Schulte-Frohlinde V, Stanley HE, Yoneyama M.

Author information:

(1)Department of Physics and Center for Polymer Studies, Boston University, and

Division of Sleep Medicine, Brigham and Women's Hospital, Boston, Massachusetts

02115, USA.

Many physical and physiological signals exhibit complex scale-invariant features

characterized by 1/f scaling and long-range power-law correlations, indicating a

possibly common control mechanism. Specifically, it has been suggested that

dynamical processes, influenced by inputs and feedback on multiple time scales,

may be sufficient to give rise to 1/f scaling and scale invariance. Two examples

of physiologic signals that are the output of hierarchical multiscale physiologic

systems under neural control are the human heartbeat and human gait. Here we show

that while both cardiac interbeat interval and gait interstride interval time

series under healthy conditions have comparable 1/f scaling, they still may

belong to different complexity classes. Our analysis of the multiFRACTAL scaling

exponents of the fluctuations in these two signals demonstrates that in contrast

to the multiFRACTAL behavior found in healthy heartbeat dynamics, gait time

series exhibit less complex, close to monoFRACTAL behavior. Further, we find

strong anticorrelations in the sign and close to random behavior for the

magnitude of gait fluctuations at short and intermediate time scales, in contrast

to weak anticorrelations in the sign and strong positive correlation for the

magnitude of heartbeat interval fluctuations-suggesting that the neural

mechanisms of cardiac and gait control exhibit different linear and nonlinear

features. These findings are of interest because they underscore the limitations

of traditional two-point correlation methods in fully characterizing

physiological and physical dynamics. In addition, these results suggest that

different mechanisms of control may be responsible for varying levels of

complexity observed in physiological systems under neural regulation and in

physical systems that possess similar 1/f scaling.

PMID: 19518269 [PubMed - indexed for MEDLINE]

136. Phys Rev Lett. 2009 Mar 13;102(10):106406. Epub 2009 Mar 13.

MultiFRACTAL analysis with the probability density function at the

three-dimensional Anderson transition.

Rodriguez A(1), Vasquez LJ, Römer RA.

Author information:

(1)Department of Physics and Centre for Scientific Computing, University of Warwick,

Coventry, CV4 7AL, United Kingdom.

The probability density function (PDF) for critical wave function amplitudes is

studied in the three-dimensional Anderson model. We present a formal expression

between the PDF and the multiFRACTAL spectrum f(alpha) in which the role of

finite-size corrections is properly analyzed. We show the non-Gaussian nature and

the existence of a symmetry relation in the PDF. From the PDF, we extract

information about f(alpha) at criticality such as the presence of negative

FRACTAL dimensions and the possible existence of termination points. A PDF-based

multiFRACTAL analysis is shown to be a valid alternative to the standard approach

based on the scaling of inverse participation ratios.

PMID: 19392138 [PubMed]

137. Phys Rev E Stat Nonlin Soft Matter Phys. 2009 Jan;79(1 Pt 2):016104. Epub 2009

Jan 8.

Free-electron gas in the Apollonian network: multiFRACTAL energy spectrum and its

thermodynamic fingerprints.

de Oliveira IN(1), de Moura FA, Lyra ML, Andrade JS Jr, Albuquerque EL.

Author information:

(1)Instituto de Física, Universidade Federal de Alagoas, 57072-970 Maceió-AL,


We study the free-electron gas in an Apollonian network within the tight-binding

framework. The scale-free and small-world character of the underlying lattice is

known to result in a quite structured energy spectrum with deltalike

singularities, gaps, and minibands. After an exact numerical diagonalization of

the corresponding adjacency matrix of the network with a finite number of

generations, we employ a scaling analysis of the moments of the density of states

to characterize its multiFRACTALity and report the associated singularity

spectrum. The FRACTAL nature of the energy spectrum is also shown to be reflected

in the thermodynamic behavior by logarithmic modulations on the temperature

dependence of the specific heat. The absence of chiral symmetry of the Apollonian

network leads to distinct thermodynamic behaviors for electron and hole

thermal excitations.

PMID: 19257104 [PubMed]

138. Braz J Biol. 2008 Nov;68(4 Suppl):1003-12.

Comments about some species abundance patterns: classic, neutral, and niche

partitioning models.

Ferreira FC(1), Petrere M Jr.

Author information:

(1)Departamento de Ecologia, Universidade Estadual Paulista - UNESP,CP 199, CEP

13506-900, Rio Claro, SP, Brazil.

The literature on species abundance models is extensive and a great deal of new

and important contributions have been published in the last three decades.

Broadly speaking, one can recognize five families of species abundance models: i)

purely statistical or classic models (Broken-stick, Log-normal, Logarithmic and

Geometric series); ii) branching process (Zipf-Mandelbrot and FRACTAL branching

models); iii) population dynamics (Neutral models included); iv) spatial

distribution of individuals (MultiFRACTAL and HEAP models) and v) niche

partitioning (Sugihara's breakage and Tokeshi models). Among these the neutral,

the classic and the niche partitioning models have been the most applied to

natural communities, the former having been more extensively discussed than the

others in the last years. The objective of this paper is to comment some aspects

of the classic, neutral and niche partitioning models in a way that the proposed

distributions may contribute to the analysis of the empirical patterns of species

abundance. In spite of the variety of models, the distributions in general vary

between the log-normal and the logarithmic series. From these models the

Power-Fraction, together with independent niche dimensions measures, are amenable

to experimental tests and may offer answers on which resources are important in

the structuring of biological communities.

PMID: 19197471 [PubMed - indexed for MEDLINE]

139. Conf Proc IEEE Eng Med Biol Soc. 2008;2008:3912-5. doi:


3D multiFRACTAL analysis: a new tool for epileptic fit source detection in SPECT


Lopes R(1), Viard R, Dewalle AS, Steinling M, Maouche S, Betrouni N.

Author information:

(1)INSERM U703, Lille, France.

One of the imaging modalities used for the diagnosis of epilepsy is SPECT

(Single-Photon Emission Computed Tomography). Ictal and interictal images are

registered to MR images (SISCOM, Subtracted Ictal SPECT COregistered to MR) to

delineate the sources. However, in some cases and for many reasons, the used

method does not lead to precise delimitation of epileptic fit sources. In this

case, studies have investigated group analyses or combined other

modalities like EEG (Electroencephalography). This study investigates the

possibility of using a mathematical model of the image texture to detect the

changes on SPECT images. Beyond encouraging preliminary results concerning the

multiFRACTAL analysis to distinguish volunteers and epileptic patients, our aim

was to detect sources by computing the singularity spectrum. The experiment is

divided into two phases. First, we developed a 3D method for the singularity

spectrum computation. In the test phase, we applied this multiFRACTAL spectrum to the

sources detection on SPECT images. The results obtained on a base of seven

patients show that the proposed method is encouraging. Indeed, the detected

epileptic fit sources were in agreement with the expert diagnosis.

PMID: 19163568 [PubMed - indexed for MEDLINE]

140. Sensors (Basel). 2009;9(11):8669-83. doi: 10.3390/s91108669. Epub 2009 Oct 29.

Super-resolution reconstruction of remote sensing images using multiFRACTAL


Hu MG(1), Wang JF, Ge Y.

Author information:

(1)Institute of Geographic Sciences & Nature Resources Research, Chinese Academy of

Sciences, Beijing, China.


Satellite remote sensing (RS) is an important contributor to Earth observation,

providing various kinds of imagery every day, but low spatial resolution remains

a critical bottleneck in a lot of applications, restricting higher spatial

resolution analysis (e.g., intra-urban). In this study, a multiFRACTAL-based

super-resolution reconstruction method is proposed to alleviate this problem. The

multiFRACTAL characteristic is common in Nature. The self-similarity or

self-affinity present in the image is useful to estimate details at larger and

smaller scales than the original. We first look for the presence of multiFRACTAL

characteristics in the images. Then we estimate parameters of the information

transfer function and noise of the low resolution image. Finally, a noise-free,

spatial resolution-enhanced image is generated by a FRACTAL coding-based

denoising and downscaling method. The empirical case shows that the reconstructed

super-resolution image performs well in detail enhancement. This method is not

only useful for remote sensing in investigating Earth, but also for other images

with multiFRACTAL characteristics.

PMCID: PMC3260607

PMID: 22291530 [PubMed]

141. Chaos. 2008 Sep;18(3):033115. doi: 10.1063/1.2965502.

MultiFRACTAL and statistical analyses of heat release fluctuations in a spark

ignition engine.

Sen AK(1), Litak G, Kaminski T, Wendeker M.

Author information:

(1)Department of Mathematical Sciences, Indiana University, 402 North Blackford

Street, Indianapolis, Indiana 46202, USA.

Using multiFRACTAL and statistical analyses, we have investigated the complex

dynamics of cycle-to-cycle heat release variations in a spark ignition engine.

Three different values of the spark advance angle (Delta beta) are examined. The

multiFRACTAL complexity is characterized by the singularity spectrum of the heat

release time series in terms of the Holder exponent. The broadness of the

singularity spectrum gives a measure of the degree of multiFRACTALity or

complexity of the time series. The broader the spectrum, the richer and more

complex is the structure with a higher degree of multiFRACTALity. Using this

broadness measure, the complexity in heat release variations is compared for the

three spark advance angles (SAAs). Our results reveal that the heat release data

are most complex for Delta beta=30 degrees followed in order by Delta beta=15

degrees and 5 degrees. In other words, the complexity increases with increasing

SAA. In addition, we found that for all the SAAs considered, the heat release

fluctuations behave like an antipersistent or a negatively correlated process,

becoming more antipersistent with decreasing SAA. We have also performed a

statistical analysis of the heat release variations by calculating the kurtosis

of their probability density functions (pdfs). It is found that for the smallest

SAA considered, Delta beta=5 degrees, the pdf is nearly Gaussian with a kurtosis

of 3.42. As the value of the SAA increases, the pdf deviates from a Gaussian

distribution and tends to be more peaked with larger values of kurtosis. In

particular, the kurtosis has values of 3.94 and 6.69, for Delta beta=15 degrees

and 30 degrees, respectively. A non-Gaussian density function with kurtosis in

excess of 3 is indicative of intermittency. A larger value of kurtosis implies a

higher degree of intermittency.

(c) 2008 American Institute of Physics.

PMID: 19045453 [PubMed - indexed for MEDLINE]
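[The kurtosis criterion used in the abstract above (kurtosis 3 for a Gaussian; values in excess of 3 indicating intermittency) is easy to reproduce. A minimal Python/NumPy sketch; the sample sizes and comparison distributions here are illustrative choices, not taken from the paper:]

```python
import numpy as np

def kurtosis(x):
    """Fourth standardized moment of a sample; equals 3 for a Gaussian."""
    x = np.asarray(x, dtype=float)
    dev = x - x.mean()
    return np.mean(dev ** 4) / np.mean(dev ** 2) ** 2

rng = np.random.default_rng(0)
gauss = rng.normal(size=100_000)     # Gaussian: kurtosis near 3
laplace = rng.laplace(size=100_000)  # heavier tails: kurtosis near 6

k_gauss, k_laplace = kurtosis(gauss), kurtosis(laplace)
```

[By this measure, the heat release pdf at Delta beta=30 degrees (kurtosis 6.69) is roughly as heavy-tailed as a Laplace distribution.]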

142. J Appl Clin Med Phys. 2008 Nov 11;9(4):2741.

A novel algorithm for initial lesion detection in ultrasound breast images.

Yap MH(1), Edirisinghe EA, Bez HE.

Author information:

(1)Department of Computer Science, Loughborough University, Loughborough, U.K.

This paper proposes a novel approach to initial lesion detection in ultrasound

breast images. The objective is to automate the manual process of region of

interest (ROI) labeling in computer-aided diagnosis (CAD). We propose the use of

hybrid filtering, multiFRACTAL processing, and thresholding segmentation in

initial lesion detection and automated ROI labeling. We used 360 ultrasound

breast images to evaluate the performance of the proposed approach. Images were

preprocessed using histogram equalization before hybrid filtering and

multiFRACTAL analysis were conducted. Subsequently, thresholding segmentation was

applied on the image. Finally, the initial lesions are detected using a

rule-based approach. The accuracy of the automated ROI labeling was measured as

an overlap of 0.4 with the lesion outline as compared with lesions labeled by an

expert radiologist. We compared the performance of the proposed method with that

of three state-of-the-art methods, namely, the radial gradient index filtering

technique, the local mean technique, and the FRACTAL dimension technique. We

conclude that the proposed method is more accurate and performs more effectively

than do the benchmark algorithms considered.

PMID: 19020477 [PubMed - indexed for MEDLINE]

143. Philos Trans A Math Phys Eng Sci. 2009 Jan 28;367(1887):277-96. doi:


Methods derived from nonlinear dynamics for analysing heart rate variability.

Voss A(1), Schulz S, Schroeder R, Baumert M, Caminal P.

Author information:

(1)Department of Medical Engineering and Biotechnology, University of Applied

Sciences Jena, 07745 Jena, Germany.

Methods from nonlinear dynamics (NLD) have shown new insights into heart rate

(HR) variability changes under various physiological and pathological conditions,

providing additional prognostic information and complementing traditional time-

and frequency-domain analyses. In this review, some of the most prominent indices

of nonlinear and FRACTAL dynamics are summarized and their algorithmic

implementations and applications in clinical trials are discussed. Several of

those indices have been proven to be of diagnostic relevance or have contributed

to risk stratification. In particular, techniques based on mono- and multiFRACTAL

analyses and symbolic dynamics have been successfully applied to clinical

studies. Further advances in HR variability analysis are expected through

multidimensional and multivariate assessments. Today, the question is no longer

whether methods from NLD should be applied, but rather which of the methods

should be selected and under which basic and standardized conditions they

should be applied.

PMID: 18977726 [PubMed - indexed for MEDLINE]

144. Dokl Biol Sci. 2008 Jul-Aug;421:257-61.

MultiFRACTAL analysis of the species structure of small-mammal communities of the

Volga-Ural paleobiocenosis.

Gelashvili DB(1), Dmitriev AI, Iudin DI, Rozenberg GS, Solntsev LA.

Author information:

(1)Faculty of Biology, Nizhni Novgorod State University, pr. Gagarina 23, Nizhni

Novgorod, 603950 Russia.

PMID: 18841809 [PubMed - indexed for MEDLINE]

145. Am Nat. 2002 Feb;159(2):138-55. doi: 10.1086/324787.

Species-area curves, diversity indices, and species abundance distributions: a

multiFRACTAL analysis.

Borda-de-Agua L(1), Hubbell SP, McAllister M.

Author information:

(1)Center for Environmental Research and Conservation, Columbia University, New

York, New York 10027, USA.

Although FRACTALs have been applied in ecology for some time, multiFRACTALs have,

in contrast, received little attention. In this article, we apply multiFRACTALs

to the species-area relationship and species abundance distributions. We

highlight two results: first, species abundance distributions collected at

different spatial scales may collapse into a single curve after appropriate

renormalization, and second, the power-law form of the species-area relationship

and the Shannon, Simpson, and Berger-Parker diversity indices belong to a family

of equations relating the species number, species abundance, and area through the

moments of the species abundance-probability density function. Explicit formulas

for these diversity indices, as a function of area, are derived. Methods to

obtain the multiFRACTAL spectra from a data set are discussed, and an example is

shown with data on tree and shrub species collected in a 50-ha plot on Barro

Colorado Island, Panama. Finally, we discuss the implications of the multiFRACTAL

formalism to the relationship between species range and abundance and the

relation between the shape of the species abundance distribution and area.

PMID: 18707410 [PubMed]
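[The diversity indices that the abstract above relates to moments of the species abundance distribution can be computed directly from abundance counts. A sketch; the index definitions are the standard ones, and the sample community is invented for illustration:]

```python
import numpy as np

def diversity_indices(abundances):
    """Shannon entropy H', Simpson concentration lambda, and
    Berger-Parker dominance d from raw species abundance counts."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0]                      # zero counts contribute nothing
    p = p / p.sum()
    shannon = -np.sum(p * np.log(p))  # H' = -sum p_i ln p_i  (q = 1 moment)
    simpson = np.sum(p ** 2)          # lambda = sum p_i^2    (q = 2 moment)
    berger = p.max()                  # d = max p_i           (q -> infinity)
    return shannon, simpson, berger

# A perfectly even 4-species community maximizes Shannon diversity (ln 4):
h, lam, d = diversity_indices([10, 10, 10, 10])
```

[The q = 1, q = 2 and q -> infinity moments correspond to the family of generalized dimensions underlying the multiFRACTAL formalism of the paper.]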

146. J Neurosci Methods. 2008 Sep 30;174(2):292-300. doi:

10.1016/j.jneumeth.2008.06.037. Epub 2008 Jul 23.

Endogenous multiFRACTAL brain dynamics are modulated by age, cholinergic blockade

and cognitive performance.

Suckling J(1), Wink AM, Bernard FA, Barnes A, Bullmore E.

Author information:

(1)Brain Mapping Unit, University of Cambridge, Department of Psychiatry,

Addenbrooke's Hospital, Cambridge CB2 0QQ, UK.

The intuitive notion that a healthy organism is characterised by regular,

homeostatic function has been challenged by observations that a loss of

complexity is, in fact, indicative of ill-health. MonoFRACTALs succinctly

describe complex processes and are controlled by a single time-invariant scaling

exponent, H, simply related to the FRACTAL dimension. Previous analyses of

resting fMRI time-series demonstrated that ageing and scopolamine administration

were both associated with increases in H and that faster response in a prior

encoding task was also associated with increased H. We revisit this experiment

with a novel, multiFRACTAL approach in which FRACTAL dynamics are assumed to be

non-stationary and defined by a spectrum of local singularity exponents.

Parameterisation of this spectrum was capable of refracting the effects of age,

scopolamine and task performance as well as refining a description of the

associated signal changes. Using the same imaging data, we also explored

turbulence as a possible mechanism underlying multiFRACTAL dynamics. Evidence is

provided that Castaing's model of turbulent information flow from high to low

scales has only limited validity, and that scale invariance of energy dissipation

is better explained by critical-phase phenomena, supporting the proposition that

the brain maintains a state of self-organised criticality.

PMCID: PMC2590659

PMID: 18703089 [PubMed - indexed for MEDLINE]

147. J Pharmacol Toxicol Methods. 2008 Sep-Oct;58(2):118-28. doi:

10.1016/j.vascn.2008.05.005. Epub 2008 Jul 10.

Heartbeat dynamics in adrenergic blocker treated conscious beagle dogs.

Li D(1), Chiang AY, Clawson CA, Main BW, Leishman DJ.

Author information:

(1)Global Statistical Sciences and Toxicology, Lilly Research Laboratories, Eli

Lilly and Company, Greenfield, IN 46140, USA.

INTRODUCTION: Adrenergic blockade as a treatment for chronic heart failure (CHF)

has proved effective, but its pharmacological mechanism on CHF remains unclear.

In the past two decades, studies on heart rate variability (HRV) have reported

that CHF patients generally have a reduced temporal complexity in heart rate

variability. On the other hand, adrenergic blockers have been shown to restore

such complexity. FRACTAL analysis is a novel and efficient tool to explore the

adrenergic blockade effect on HRV. This paper applies the detrended fluctuation

analysis (DFA) and multiFRACTAL DFA (MF-DFA) methods in an attempt to understand

the effect of adrenergic blockade on cardiac dynamics in conscious beagle dogs.

METHODS: DFA and MF-DFA analysis are conducted on RR interval data generated from

telemetry instrumented dogs receiving a combination of 15 mg/kg nadolol and 5

mg/kg phenoxybenzamine orally administered at the 22nd and 34th hour in a

parallel design (n=12). All dogs had approximately 48 h of beat-to-beat heart

rate measurements recorded in the left ventricle. Complexity measures for

heartbeat series are compared between the blocker and vehicle group. We also

compute traditional statistics for HRV and spectral parameters and examine their

correlation with FRACTAL analysis.

RESULTS: When compared to the vehicle group, the adrenergic blocker group had: 1)

longer RR intervals (p=0.02) and lower beat-to-beat variability (p=0.04); 2)

decreased low frequency (LF) and high frequency (HF) power (p=0.03), and higher

LF-to-HF ratio; 3) larger middle-range scaling exponents (p<0.01); 4) broader

multiFRACTAL spectra (p=0.03) with higher dominant singularity indices (p=0.02).

DISCUSSION: Our results show that 1) adrenergic blockade alters the sympathovagal

balance; 2) adrenergic blockers enhance the complexity of the cardiac dynamics;

3) the adrenergic blockade effect on cardiac dynamics is primarily the

attenuation of small fluctuations in RR intervals. FRACTAL analysis also has the

potential to be applied to early QT diagnosis.

PMID: 18619862 [PubMed - indexed for MEDLINE]
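[Detrended fluctuation analysis, the core of the DFA/MF-DFA methods used in the study above, fits a local trend in windows of increasing size and reads the scaling exponent alpha off a log-log plot of fluctuation versus window size. A minimal monofractal sketch; the window sizes and the test series are illustrative, not the paper's data:]

```python
import numpy as np

def dfa_exponent(x, scales):
    """DFA: integrate the series, detrend the profile in windows of each
    scale with a linear fit, and return the slope of log F(n) vs log n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        residuals = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            residuals.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(residuals)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
white = rng.normal(size=20_000)
alpha = dfa_exponent(white, [16, 32, 64, 128, 256])  # near 0.5 for white noise
```

[Uncorrelated noise gives alpha near 0.5, its running sum (a Brownian profile) gives alpha near 1.5, and anticorrelated (antipersistent) series fall below 0.5.]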

148. Phys Rev E Stat Nonlin Soft Matter Phys. 2008 Apr;77(4 Pt 2):045101. Epub 2008

Apr 18.

Self-affine FRACTALs embedded in spectra of complex networks.

Yang H(1), Yin C, Zhu G, Li B.

Author information:

(1)Department of Physics and Centre for Computational Science and Engineering,

National University of Singapore, Singapore 117542.

The scaling properties of spectra of real world complex networks are studied by

using the wavelet transform. It is found that the spectra of networks are

multiFRACTAL. According to the values of the long-range correlation exponent, the

Hurst exponent H, the networks can be classified into three types, namely, H>0.5,

H=0.5, and H<0.5. All real world networks considered belong to the class of

H>=0.5, which may be explained by the hierarchical properties.

PMID: 18517677 [PubMed]

149. Phys Rev E Stat Nonlin Soft Matter Phys. 2008 Mar;77(3 Pt 2):036210. Epub 2008

Mar 18.

Anomaly of FRACTAL dimensions observed in stochastically switched systems.

Nishikawa J(1), Gohara K.

Author information:

(1)Department of Applied Physics, Hokkaido University, Sapporo, Hokkaido, Japan.

We studied an anomaly in FRACTAL dimensions measured from the attractors of

dynamical systems driven by stochastically switched inputs. We calculated the

dimensions for different switching time lengths in two-dimensional linear

dynamical systems, and found that changes in the dimensions due to the switching

time length had a singular point when the system matrix had two different real

eigenvalues. Using partial dimensions along each eigenvector, we explicitly

derived a generalized dimension D(q) and a multiFRACTAL spectrum f(alpha) to

explain this anomalous property. The results from numerical calculations agreed

well with those from analytical equations. We found that this anomaly is caused

by linear independence, inhomogeneity of eigenvalues, and overlapping conditions.

The mechanism for the anomaly could be identified for various inhomogeneous

systems including nonlinear ones, and this reminded us of anomalies in some

physical values observed in critical phenomena.

PMID: 18517488 [PubMed]
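[The generalized dimension D(q) and spectrum f(alpha) invoked above come from the scaling of the partition function Z(q, eps) = sum_i mu_i^q ~ eps^((q-1)D(q)). For the binomial multiplicative cascade the mass exponent is known in closed form, which makes it a convenient numerical check. A sketch; the cascade weight p and moment order q are arbitrary illustrative choices:]

```python
import numpy as np

def tau_numeric(p, q, depths=range(4, 11)):
    """Mass exponent tau(q) = (q-1) D(q), estimated as the slope of
    log Z(q, eps) versus log eps for a binomial cascade measure."""
    log_Z, log_eps = [], []
    mu = np.array([1.0])
    for n in range(max(depths) + 1):
        if n in depths:
            log_Z.append(np.log(np.sum(mu ** q)))
            log_eps.append(np.log(2.0 ** -n))
        mu = np.kron(mu, [p, 1.0 - p])   # split each box into (p, 1-p) halves
    slope, _ = np.polyfit(log_eps, log_Z, 1)
    return slope

p, q = 0.7, 2.0
tau = tau_numeric(p, q)
tau_exact = -np.log2(p ** q + (1 - p) ** q)  # closed form for this cascade
```

[The multiFRACTAL spectrum f(alpha) then follows from tau(q) by a Legendre transform: alpha = dtau/dq and f(alpha) = q*alpha - tau(q).]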

150. Phys Rev E Stat Nonlin Soft Matter Phys. 2008 Mar;77(3 Pt 2):036205. Epub 2008

Mar 11.

FRACTAL Weyl law for chaotic microcavities: Fresnel's laws imply multiFRACTAL


Wiersig J(1), Main J.

Author information:

(1)Institut für Theoretische Physik, Universität Bremen, Bremen, Germany.

We demonstrate that the harmonic inversion technique is a powerful tool to

analyze the spectral properties of optical microcavities. As an interesting

example we study the statistical properties of complex frequencies of the fully

chaotic microstadium. We show that the conjectured FRACTAL Weyl law for open

chaotic systems [Lu, Phys. Rev. Lett. 91, 154101 (2003)] is valid for dielectric

microcavities only if the concept of the chaotic repeller is extended to a

multiFRACTAL by incorporating Fresnel's laws.

PMID: 18517483 [PubMed]

151. Hum Brain Mapp. 2008 Jul;29(7):791-801. doi: 10.1002/hbm.20593.

MonoFRACTAL and multiFRACTAL dynamics of low frequency endogenous brain

oscillations in functional MRI.

Wink AM(1), Bullmore E, Barnes A, Bernard F, Suckling J.

Author information:

(1)Brain Mapping Unit, Department of Psychiatry, Addenbrooke's Hospital, University

of Cambridge, Cambridge, United Kingdom.

FRACTAL processes, like trees or coastlines, are defined by self-similarity or

power law scaling controlled by a single exponent, simply related to the FRACTAL

dimension or Hurst exponent (H) of the process. MultiFRACTAL processes, like

turbulence, have more complex behaviours defined by a spectrum of possible local

scaling behaviours or singularity exponents (h). Here, we report two experiments

that explore the relationships between instrumental and cognitive variables and

the monoFRACTAL and multiFRACTAL parameters of functional magnetic resonance

imaging (fMRI) data acquired in a no-task or resting state. First, we show that

the Hurst exponent is greater in grey matter than in white matter regions, and it

is maximal in grey matter when data were acquired with an echo time known to

optimise BOLD contrast. Second, we show that latency of response in a fame

decision/facial encoding task was negatively correlated with the Hurst exponent

of resting state data acquired 30 min after task performance. This association

was localised to a right inferior frontal cortical region activated by the fame

decision task and indicated that people with shorter response latency had more

persistent dynamics (higher values of H). MultiFRACTAL analysis revealed that

faster responding participants had wider singularity spectra of resting fMRI time

series in inferior frontal cortex. Endogenous brain oscillations measured by fMRI

have monoFRACTAL and multiFRACTAL properties that can be related to instrumental

and cognitive factors in a way that indicates that these low frequency dynamics

are relevant to neurocognitive function.

(c) 2008 Wiley-Liss, Inc.

PMID: 18465788 [PubMed - indexed for MEDLINE]

152. Gynecol Obstet Invest. 2008;66(2):127-33. doi: 10.1159/000129671. Epub 2008 May


MultiFRACTAL description of the maternal surface of the placenta.

Kikuchi A(1), Unno N, Shiba M, Sunagawa S, Ogiso Y, Kozuma S, Taketani Y.

Author information:

(1)Department of Obstetrics, Center for Perinatal Medicine, Nagano Children's

Hospital, Nagano, Japan.

BACKGROUND: Recently, multiFRACTAL analysis based on generalized concepts of

FRACTALs has been applied to biological tissues composed of complex structures.

METHODS: Using digitized images of the maternal surface of 278 placentas,

multiFRACTAL parameters were measured with a FRACTAL analysis software.

RESULTS: The values of alpha(min), alpha(0), alpha(max) and the degree of

multiFRACTALity given by the alpha(max) - alpha(min) difference calculated from

278 placentas were 1.840 +/- 0.068, 2.089 +/- 0.034, 2.856 +/- 0.128 and 1.017

+/- 0.136, respectively. A significant decrease of alpha(min) and as a

consequence a significant increase in the degree of multiFRACTALity were observed

according to gestational age. The alpha(0) value of the placenta complicated by

pregnancy-induced hypertension (PIH) was significantly higher than that without

PIH. The alpha(min) and alpha(0) values of the placenta having intrauterine

growth restriction (IUGR) were significantly higher than those without IUGR. On

the other hand, the presence of chorioamnionitis did not change multiFRACTAL

properties of the maternal surface of the placenta.

CONCLUSION: The multiFRACTAL parameters may be objective indices of the

heterogeneity or complexity of the macroscopic morphology of the maternal surface

of the placenta. MultiFRACTAL analysis holds a promise for quantitatively

evaluating physiological and pathological development of the placenta.

(c) 2008 S. Karger AG, Basel.

PMID: 18463415 [PubMed - indexed for MEDLINE]

153. Med Phys. 2008 Feb;35(2):717-23.

MultiFRACTALity, sample entropy, and wavelet analyses for age-related changes in

the peripheral cardiovascular system: preliminary results.

Humeau A(1), Chapeau-Blondeau F, Rousseau D, Rousseau P, Trzepizur W, Abraham P.

Author information:

(1)Groupe esaip, 18 rue du 8 mai 1945, BP 80022, 49180 Saint Barthélémy d'Anjou, France.


Using signal processing measures we evaluate the effect of aging on the

peripheral cardiovascular system. Laser Doppler flowmetry (LDF) signals,

reflecting the microvascular perfusion, are recorded on the forearm of 27 healthy

subjects between 20-30, 40-50, or 60-70 years old. Wavelet-based representations,

Hölder exponents, and sample entropy values are computed for each time series.

The results indicate a possible modification of the peripheral cardiovascular

system with aging. Thus, the endothelial-related metabolic activity decreases,

but not significantly, with aging. Furthermore, LDF signals are more monoFRACTAL

for elderly subjects than for young people for whom LDF signals are weakly

multiFRACTAL: the average range of Hölder exponents computed with a parametric

generalized quadratic variation based estimation method is 0.13 for subjects

between 20 and 30 years old and 0.06 for subjects between 60 and 70 years old.

Moreover, the average mean sample entropy value of LDF signals slightly decreases

with age: it is 1.34 for subjects between 20 and 30 years old and 1.19 for

subjects between 60 and 70 years old. Our results could assist in gaining

knowledge on the relationship between microvascular system status and age and

could also lead to a more accurate age-related nonlinear modeling.

PMID: 18383693 [PubMed - indexed for MEDLINE]
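[Sample entropy, used in the study above alongside the multiFRACTAL measures, quantifies regularity as the negative log of the conditional probability that subsequences matching for m points also match for m+1 points. A compact sketch; the parameters m = 2 and tolerance r = 0.2 x std are the conventional defaults, not necessarily the paper's settings:]

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts template pairs of length m
    within tolerance r = r_frac * std(x), and A counts pairs that still
    match when extended to length m + 1. Lower values = more regular."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    N = len(x)

    def pair_count(mm):
        t = np.array([x[i:i + mm] for i in range(N - mm)])
        total = 0
        for i in range(len(t) - 1):
            dist = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            total += np.sum(dist <= r)
        return total

    B = pair_count(m)
    A = pair_count(m + 1)
    return -np.log(A / B)

periodic = np.tile([0.0, 1.0], 100)     # strictly alternating signal
se_periodic = sample_entropy(periodic)  # near 0: highly regular
```

[A white-noise series of the same length yields a much larger value, matching the paper's reading of lower sample entropy as reduced complexity.]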

154. Phys Rev E Stat Nonlin Soft Matter Phys. 2008 Feb;77(2 Pt 2):026111. Epub 2008

Feb 15.

MultiFRACTALity and randomness in the unstable plastic flow near the lower

strain-rate boundary of instability.

Lebyodkin MA(1), Lebedkina TA.

Author information:

(1)Laboratoire de Physique et Mécanique des Matériaux, UMR CNRS No 7554, Université

Paul Verlaine - Metz, Ile du Saulcy, 57045 Metz Cedex, France.

The unstable plastic flow of an AlMg alloy, associated with the Portevin-Le

Chatelier effect, was studied near the lower strain-rate boundary of instability

using multiFRACTAL analysis. Self-similarity of deformation curves, indicating

long-range time correlations of stress serrations, was detected within the

strain-rate range where serrations are commonly ascribed to the occurrence of

uncorrelated deformation bands. The deformation curves display a wide range of

shapes that are characterized by different groupings of serrations. MultiFRACTAL

analysis provides a method to quantify the observed complexity and compare it to

known Portevin-Le Chatelier effect regimes. The measurement noise effect on the

multiFRACTAL spectra determined from experimental data was mimicked by

superposing multiFRACTAL Cantor sets with random noise. Such tests using standard

multiFRACTAL data sets justify the separation of self-similar and random

components of the serrated deformation curves. Furthermore, these results shed

light on the general problem of the effect of experimental noise on the apparent

multiFRACTAL properties of physical FRACTALs.

PMID: 18352094 [PubMed]

155. J Theor Biol. 2008 Mar 7;251(1):60-7. Epub 2007 Sep 26.

Why the wrinkling transition in partially polymerized membranes is not universal?

FRACTAL-multiFRACTAL hierarchy.

Chaieb S(1), Málková S, Lal J.

Author information:

(1)Mechanical Science and Engineering Department, University of Illinois at

Urbana-Champaign, USA.

When partially polymerized membranes wrinkle they exhibit a passage from a

conventional buckling (due to an instability caused by chiral symmetry breaking)

at low polymerization to a local roughening (due to a frustration in the local

packing of the chiral molecules composing the membrane) as a function of the

polymerization of the lipids' aliphatic tails. This transition was found to be

non-universal and here we used neutron scattering to elucidate that this behavior

is due to the onset of stretching in the membrane accompanied by a bilayer

thickness variation. Close to the percolation limit this deformation is plastic

similar to mutated lysozymes. We draw an analogy between this transition and

echinocytes in red blood cells.

PMID: 18083197 [PubMed - indexed for MEDLINE]

156. Biophys J. 2007 Dec 15;93(12):L59-61.

MultiFRACTALity in the peripheral cardiovascular system from pointwise Hölder

exponents of laser Doppler flowmetry signals.

Humeau A(1), Chapeau-Blondeau F, Rousseau D, Tartas M, Fromy B, Abraham P.

Author information:

(1)Groupe ISAIP-ESAIP, Saint Barthélémy d'Anjou, France.

We study the dynamics of skin laser Doppler flowmetry signals giving a peripheral

view of the cardiovascular system. The analysis of Hölder exponents reveals that

the experimental signals are weakly multiFRACTAL for young healthy subjects at

rest. We implement the same analysis on data generated by a standard theoretical

model of the cardiovascular system based on nonlinear coupled oscillators with

linear couplings and fluctuations. We show that the theoretical model, although

it captures basic features of the dynamics, is not complex enough to reflect the

multiFRACTAL irregularities of microvascular mechanisms.

PMCID: PMC2098720

PMID: 18045964 [PubMed - indexed for MEDLINE]

157. Conf Proc IEEE Eng Med Biol Soc. 2007;2007:5543-6.

Multidimensional models for methodological validation in multiFRACTAL analysis.

Lopes R(1), Dubois P, Bhouri I, Puech P, Maouche S, Betrouni N.

Author information:

(1)Inserm, U703, Lille, France.

MultiFRACTAL analysis is known as a useful tool in signal analysis. However

methods are often used without methodological validation. In this study, we

define multidimensional models in order to validate multiFRACTAL analysis


PMID: 18003268 [PubMed - indexed for MEDLINE]

158. Conf Proc IEEE Eng Med Biol Soc. 2007;2007:5035-8.

Local-scale analysis of cardiovascular signals by detrended fluctuations

analysis: effects of posture and exercise.

Castiglioni P(1), Quintin L, Civijian A, Parati G, Di Rienzo M.

Author information:

(1)Polo Tecnologico (Biomedical Technology Department), IRCCS S. Maria Nascente,

Fondazione Don Gnocchi Onlus, Milan, Italy.

The FRACTAL structure of heart rate is usually quantified by estimating a

short-term (alpha(1)) and a long-term (alpha(2)) scaling exponent by Detrended

Fluctuations Analysis (DFA). Evidence, however, has been provided that heart rate

is a multiFRACTAL signal, better characterized by a large number of scaling

exponents. Aim of this study is to verify whether two scaling exponents only from

DFA provide a sufficiently accurate description of the possibly multiFRACTAL

nature of cardiovascular signals. We measured ECG and finger arterial pressure in

33 volunteers for 10 minutes during each of 3 conditions: supine rest (SUP);

sitting at rest (SIT); light physical exercise (EXE). DFA was applied on the

beat-by-beat series of R-R interval (RRI) and mean arterial pressure (MAP). We

then computed the local scaling exponent alpha(n), defined as the slope of the

detrended fluctuation function F(n) around the beat scale n, in a log-log plot.

If alpha(1) and alpha(2) correctly model the multiscale structure of blood

pressure and heart rate, we should find that alpha(n) is constant over a

short-term and a long-term range of beat scales. Results show that only the

long-term alpha(2) exponent provides a relatively good approximation of the

multiscale structure of RRI and MAP. Moreover, posture and physical activity have

important effects on local scaling exponents, and on the range of beat scales n

where alpha(n) can be approximated by a constant alpha(2) coefficient.

PMID: 18003137 [PubMed - indexed for MEDLINE]

159. Brain Res. 2007 Dec;1186:113-23. Epub 2007 Oct 16.

Neuronal response to Shepard's tones: an auditory fMRI study using multiFRACTAL


Shimizu Y(1), Umeda M, Mano H, Aoki I, Higuchi T, Tanaka C.

Author information:

(1)Department of Neurosurgery, Meiji University of Oriental Medicine, Hiyoshi-cho,

Funai-gun, Kyoto, Japan.

Shepard's tones are a typical example of an auditory illusion. They consist of a

series of computer-generated tones that prohibit relative pitch discrimination.

As a result, when repetitively played in sequence, the illusion of an

ever-ascending scale is evoked. In order to investigate this aural phenomenon,

fMRI time series were acquired during presentation of a conventional

block-designed paradigm as well as during continuous presentation of Shepard's

tones. With respect to the different setups of the two experiments, two

fundamentally different methods were applied in order to conduct data analysis.

Common Statistical Parameter Mapping served to evaluate the time series obtained

with the block-designed paradigm. For the continuous experiment, a novel

wavelet-based multiFRACTAL analysis was used, recently proposed as a

classification tool for fMRI time series. This approach applies the wavelet

transform to extract multiFRACTAL spectra from time-signals. For reasons of

quantification, we introduced an ameliorated method for visual inspection of the

multiFRACTAL properties. The results proved existence of characteristic neural

responses to continuously presented Shepard's tones. Interestingly, the same was

not restricted to the auditory cortex, but also involved areas of the visual

cortex. Related impact on the imaged cognitive areas, primary motor cortex, and

primary sensory cortex could not be observed. We further provide evidence that

pitch misjudgment does not occur in temporal concurrence with the repetition of

the whole scale, but according to whether the main perceived frequency is located

in the sensitive range of auditory perception or not. We remark that this is the

first time continuously stimulated brain areas could be detected by means of


PMID: 17999926 [PubMed - indexed for MEDLINE]

160. Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Oct;76(4 Pt 2):046612. Epub 2007

Oct 29.

Soliton FRACTALs in the Korteweg-de Vries equation.

Zamora-Sillero E(1), Shapovalov AV.

Author information:

(1)Departamento de Fisica Aplicada I, Escuela Universitaria Politecnica. Universidad

de Sevilla Virgen de Africa 7, 41011 Sevilla, Spain.

We have studied the process of creation of solitons and generation of FRACTAL

structures in the Korteweg-de Vries (KdV) equation when the relation between the

nonlinearity and dispersion is abruptly changed. We observed that when this

relation is changed nonadiabatically the solitary waves present in the system

lose their stability and split up into ones that are stable for the set of

parameters. When this process is successively repeated the trajectories of the

solitary waves create a FRACTAL treelike structure where each branch bifurcates

into others. This structure is formed until the iteration where two solitary

waves overlap just before the breakup. By means of a method based on the inverse

scattering transformation, we have obtained analytical results that predict and

control the number, amplitude, and velocity of the solitary waves that arise in

the system after every change in the relation between the dispersion and the

nonlinearity. This complete analytical information allows us to define a

recursive L system which coincides with the treelike structure, governed by KdV,

until the stage when the solitons start to overlap and is used to calculate the

Hausdorff dimension and the multiFRACTAL properties of the set formed by the

segments defined by each of the two "brothers" solitons before every breakup.

PMID: 17995132 [PubMed]

161. Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Oct;76(4 Pt 1):041910. Epub 2007

Oct 19.

MultiFRACTALity and scale invariance in human heartbeat dynamics.

Ching ES(1), Tsang YK.

Author information:

(1)Department of Physics and Institute of Theoretical Physics, The Chinese

University of Hong Kong, Shatin, Hong Kong.

Human heart rate is known to display complex fluctuations. Evidence of

multiFRACTALity in heart rate fluctuations in healthy state has been reported

[Ivanov, Nature (London) 399, 461 (1999)]. This multiFRACTAL character could be

manifested as the dependence of the probability density functions (PDFs) of the

interbeat interval increments, which are the differences in two interbeat

intervals that are separated by n beats, on n . On the other hand, "scale

invariance in the PDFs of detrended healthy human heart rate increments" was

recently reported [Kiyono, Phys. Rev. Lett. 93, 178103 (2004)]. In this paper, we

clarify that the scale invariance reported is actually exhibited by the PDFs of

the increments of the "detrended" integrated healthy interbeat interval and

should, therefore, be more accurately referred to as the scale invariance or n

independence of the PDFs of the sum of n detrended interbeat intervals. Indeed,

we demonstrate explicitly that the PDFs of detrended healthy interbeat interval

increments are scale or n dependent in accord with its multiFRACTAL character.

Our work also establishes that this n independence of the PDFs of the sum of n

detrended interbeat intervals is a general feature of human heartbeat dynamics,

shared by heart rate fluctuations in both healthy and pathological states.

PMID: 17995029 [PubMed - indexed for MEDLINE]
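
This n-dependence test is simple to sketch: compute increment PDFs at several lags n and compare their shapes. For a multiFRACTAL series the standardized PDF shape changes with n; for a monoFRACTAL one it does not. A minimal illustration in Python, using synthetic white noise in place of real RR data (illustrative only, not the authors' code):

```python
import numpy as np

def increment_pdfs(rr, lags=(1, 8, 64), bins=41):
    """Histogram PDFs of interbeat-interval increments at several lags n.

    Each increment series is standardized so that only the *shape* of the
    PDF is compared across lags, as in the scale-(in)dependence test above.
    """
    pdfs = {}
    for n in lags:
        inc = rr[n:] - rr[:-n]                    # increments separated by n beats
        inc = (inc - inc.mean()) / inc.std()      # standardize for shape comparison
        hist, edges = np.histogram(inc, bins=bins, range=(-5, 5), density=True)
        pdfs[n] = (0.5 * (edges[:-1] + edges[1:]), hist)
    return pdfs

# Surrogate RR series: white noise around 0.8 s (a real analysis would use
# measured, detrended interbeat intervals).
rng = np.random.default_rng(0)
rr = rng.normal(0.8, 0.05, 20000)
pdfs = increment_pdfs(rr)
```

For white noise the standardized PDFs coincide across lags; genuine multiFRACTAL heartbeat data would show a systematic n-dependence.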

162. Nord J Psychiatry. 2007;61(5):339-42.

MultiFRACTAL analysis as an aid in the diagnostics of mental disorders.

Slezin VB(1), Korsakova EA, Dytjatkovsky MA, Schultz EA, Arystova TA, Siivola JR.

Author information:

(1)Bekhterevs Psychoneurological Research Institute, St Petersburg, Russia.

The digitalization of EEGs (electroencephalograms) has opened new possibilities

for analyzing the electrical activity of the brain. This has offered new methods,

e.g. multiFRACTAL analysis of 1/f(beta) EEG rhythm fluctuations, one of the highly

mathematical methods now feasible in routine practice, since modern personal

computers (PCs) have reached sufficient computing power. In this study, we

applied multiFRACTAL analysis of 1/f(beta) EEG rhythm fluctuations in 33

patients suffering from schizophrenia and schizophrenia-like syndromes, and we

had 23 healthy controls. Our results indicated that the patients suffering from

schizophrenia have statistically different values compared with the controls.

This method is rather easy and quick to perform when using a standard PC. It may

have the potential to become an important tool in the diagnostics and analysis of

the patients with schizophrenia and schizophreniform psychoses. It can help to

understand the quasi-chaotic processes in neural processing and narrow the gap

between phenomenological psychiatry and bio-psychiatry.

PMID: 17990194 [PubMed - indexed for MEDLINE]

163. Conf Proc IEEE Eng Med Biol Soc. 2006;1:1450-3.

Further study of the asymmetry for multiFRACTAL spectra of heartbeat time series.

Muñoz-Diosdado A(1), Del Río-Correa JL.

Author information:

(1)Department of Mathematics, Unidad Profesional Interdisciplinaria de

Biotecnología, Instituto Politécnico Nacional, México, Col. Barrio la Laguna

Ticomán, 07340, México, DF.

We study the asymmetry of multiFRACTAL spectra of diurnal heartbeat time series

from healthy young subjects, healthy elderly subjects and patients with

congestive heart failure (CHF). Aging and CHF cause loss of multiFRACTALity. We

report here some ways of analyzing the asymmetry of these spectra and we show how

the joint analysis of the degree of multiFRACTALity and the parameters that

characterize the asymmetry can differentiate between the cardiac interbeat time

series of young and elderly persons and it can also separate healthy subjects and

CHF patients.

PMID: 17946464 [PubMed - indexed for MEDLINE]

164. Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Sep;76(3 Pt 2):036705. Epub 2007

Sep 14.

FRACTAL geometry in an expanding, one-dimensional, Newtonian universe.

Miller BN(1), Rouet JL, Le Guirriec E.

Author information:

(1)Department of Physics and Astronomy, Texas Christian University, Fort Worth,

Texas 76129, USA.

Observations of galaxies over large distances reveal the possibility of a FRACTAL

distribution of their positions. The source of FRACTAL behavior is the lack of a

length scale in the two body gravitational interaction. However, even with new,

larger, sample sizes from recent surveys, it is difficult to extract information

concerning FRACTAL properties with confidence. Similarly, three-dimensional

N-body simulations with a billion particles only provide a thousand particles per

dimension, far too small for accurate conclusions. With one-dimensional models

these limitations can be overcome by carrying out simulations with on the order

of a quarter of a million particles without compromising the computation of the

gravitational force. Here the multiFRACTAL properties of two of these models that

incorporate different features of the dynamical equations governing the evolution

of a matter dominated universe are compared. For each model at least two scaling

regions are identified. By employing criteria from dynamical systems theory it is

shown that only one of them can be geometrically significant. The results share

important similarities with galaxy observations, such as hierarchical clustering

and apparent biFRACTAL geometry. They also provide insights concerning possible

constraints on length and time scales for FRACTAL structure. They clearly

demonstrate that FRACTAL geometry evolves in the mu (position, velocity) space.

The observed patterns are simply a shadow (projection) of higher-dimensional

structures.

PMID: 17930359 [PubMed]

165. Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2007 Jun;24(3):522-5.

[MultiFRACTAL analysis of genomes sequences' CGR graph].

[Article in Chinese]

Fu W(1), Wang Y, Lu D.

Author information:

(1)Department of Physics, Fudan University, Shanghai 200433, China.

To describe the FRACTAL feature of the CGR (chaos-game representation) graph of

genome sequences, a multiFRACTAL theory is presented in the analysis. By

studying the effect of three probability sets on the scale invariance range, the

probability set with the best scale invariance is chosen, and then the smooth

general dimension spectrum and multiFRACTAL spectrum are calculated. The

experimental result shows that the probability set composed of the relative

probability has the best scale-invariance performance. The scale invariance has

three different variance regions, which indicate that genome sequence segments

with different lengths have different distribution rules. It is concluded that

the multiFRACTAL method is effective for describing the FRACTAL feature of the

CGR graph of genome sequences.

PMID: 17713253 [PubMed - indexed for MEDLINE]
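
The chaos-game representation used here maps a DNA sequence into the unit square by repeated half-way moves toward the corner assigned to each base; the box-occupancy probabilities of the resulting point cloud are the "probability set" on which the multiFRACTAL spectra are computed. A minimal sketch (the corner assignment below is one common convention and may differ from the paper's):

```python
import numpy as np

# One common corner convention; other authors permute the bases.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    """Chaos-game representation: midpoint iteration toward each base's corner."""
    pts = np.empty((len(seq), 2))
    x, y = 0.5, 0.5                          # start at the centre of the unit square
    for i, base in enumerate(seq):
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        pts[i] = (x, y)
    return pts

def cgr_probabilities(pts, k=2):
    """Occupancy probabilities on a 2^k x 2^k grid; their q-th moments give
    the generalized dimension spectrum D(q)."""
    n = 2 ** k
    idx = (pts * n).astype(int).clip(0, n - 1)
    counts = np.zeros((n, n))
    for gx, gy in idx:
        counts[gy, gx] += 1
    return counts / len(pts)

pts = cgr_points("ACGTACGGTTAACCGGATCG")     # toy sequence
probs = cgr_probabilities(pts, k=2)
```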

166. Philos Trans A Math Phys Eng Sci. 2008 Feb 13;366(1864):345-57.

A complex biological system: the fly's visual module.

Baptista MS(1), de Almeida LO, Slaets JF, Köberle R, Grebogi C.

Author information:

(1)Institut für Physik, Am Neuen Palais 10, Universität Potsdam, 14469 Potsdam, Germany.


Is the characterization of biological systems as complex systems in the

mathematical sense a fruitful assertion? In this paper we argue in the

affirmative, although obviously we do not attempt to confront all the issues

raised by this question. We use the fly's visual system as an example and analyse

our experimental results of one particular neuron in the fly's visual system from

this point of view. We find that the motion-sensitive 'H1' neuron, which converts

incoming signals into a sequence of identical pulses or 'spikes', encodes the

information contained in the stimulus into an alphabet composed of a few letters.

This encoding occurs on multilayered sets, one of the features attributed to

complex systems. The conversion of intervals between consecutive occurrences of

spikes into an alphabet requires us to construct a generating partition. This

entails a one-to-one correspondence between sequences of spike intervals and

words written in the alphabet. The alphabet dynamics is multiFRACTAL both with

and without stimulus, though the multiFRACTALity increases with the stimulus

entropy. This is in sharp contrast to models generating independent spike

intervals, such as models using Poisson statistics, whose dynamics is

monoFRACTAL. We embed the support of the probability measure, which describes the

distribution of words written in this alphabet, in a two-dimensional space, whose

topology can be reproduced by an M-shaped map. This map has positive Lyapunov

exponents, indicating a chaotic-like encoding.

PMID: 17673416 [PubMed - indexed for MEDLINE]

167. Ann Noninvasive Electrocardiol. 2007 Apr;12(2):130-6.

The "Chaos Theory" and nonlinear dynamics in heart rate variability analysis:

does it work in short-time series in patients with coronary heart disease?

Krstacic G(1), Krstacic A, Smalcelj A, Milicic D, Jembrek-Gostovic M.

Author information:

(1)Institute for Cardiovascular Diseases and Rehabilitation, Zagreb, Croatia.

BACKGROUND: Dynamic analysis techniques may quantify abnormalities in heart rate

variability (HRV) based on nonlinear and FRACTAL analysis (chaos theory). The

article emphasizes clinical and prognostic significance of dynamic changes in

short-time series applied on patients with coronary heart disease (CHD) during

the exercise electrocardiograph (ECG) test.

METHODS: The subjects were included in the series after complete cardiovascular

diagnostic data. Series of R-R and ST-T intervals were obtained from exercise ECG

data after digital sampling. The rescaled range analysis method determined the

FRACTAL dimension of the intervals. To quantify the FRACTAL long-range correlation

properties of heart rate variability, the detrended fluctuation analysis

technique was used. Approximate entropy (ApEn) was applied to quantify the

regularity and complexity of time series, as well as unpredictability of

fluctuations in time series.

RESULTS: It was found that the short-term FRACTAL scaling exponent (alpha(1)) is

significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P <

0.001). The patients with CHD had higher FRACTAL dimension in each exercise test

program separately, as well as in the exercise program as a whole. ApEn was

significantly lower in the CHD group in both RR and ST-T ECG intervals (P < 0.001).

CONCLUSIONS: The nonlinear dynamic methods could have clinical and prognostic

applicability also in short-time ECG series. Dynamic analysis based on chaos

theory during the exercise ECG test points to multiFRACTAL time series in CHD

patients, who lose normal FRACTAL characteristics and regularity in HRV. Nonlinear

analysis technique may complement traditional ECG analysis.

PMID: 17593181 [PubMed - indexed for MEDLINE]

168. Phys Rev E Stat Nonlin Soft Matter Phys. 2007 Mar;75(3 Pt 1):032902. Epub 2007

Mar 12.

BiFRACTALity of human DNA strand-asymmetry profiles results from transcription.

Nicolay S(1), Brodie Of Brodie EB, Touchon M, Audit B, d'Aubenton-Carafa Y,

Thermes C, Arneodo A.

Author information:

(1)Laboratoire Joliot-Curie and Laboratoire de Physique, UMR 5672, CNRS, ENS-Lyon,

46 Allée d'Italie, 69364 Lyon Cedex 07, France.

We use the wavelet transform modulus maxima method to investigate the

multiFRACTAL properties of strand-asymmetry DNA walk profiles in the human

genome. This study reveals the biFRACTAL nature of these profiles, which involve

two competing scale-invariant (up to repeat-masked distances less than or similar to 40

kbp) components characterized by Hölder exponents h{1}=0.78 and h{2}=1,

respectively. The former corresponds to the long-range-correlated homogeneous

fluctuations previously observed in DNA walks generated with structural codings.

The latter is associated with the presence of jumps in the original

strand-asymmetry noisy signal S. We show that a majority of upward (downward)

jumps co-locate with gene transcription start (end) sites. Here 7228 human gene

transcription start sites from the refGene database are found within 2 kbp from

an upward jump of amplitude DeltaS > or = 0.1, which suggests that about 36% of

annotated human genes present significant transcription-induced strand asymmetry

and very likely high expression rate.

PMID: 17500744 [PubMed - indexed for MEDLINE]

169. Acad Radiol. 2007 May;14(5):513-21.

FRACTAL analysis of mammographic parenchymal patterns in breast cancer risk

assessment.

Li H(1), Giger ML, Olopade OI, Lan L.

Author information:

(1)Department of Radiology, The University of Chicago, 5841 S. Maryland Avenue,

Chicago, IL 60637, USA.

Comment in

Acad Radiol. 2007 May;14(5):511-2.

RATIONALE AND OBJECTIVES: To evaluate FRACTAL-based computerized image analyses

of mammographic parenchymal patterns in the task of differentiating between women

at high risk and women at low risk for developing breast cancer.

MATERIALS AND METHODS: The FRACTAL-based texture analyses are based on a

box-counting method and a Minkowski dimension, and were performed within the

parenchymal regions of normal mammograms. Four approaches were evaluated: 1) a

conventional box-counting method, 2) a modified box-counting technique using

linear discriminant analysis (LDA), 3) a global Minkowski dimension, and 4) a

modified Minkowski technique using LDA. These FRACTAL based texture features were

extracted from regions of interest to assess the mammographic parenchymal

patterns of the images. Receiver operating characteristic analysis was used to

evaluate the performance of these features in the task of differentiating between

the two groups of women.

RESULTS: Receiver operating characteristic analysis yielded an A(z) value of 0.74

based on the conventional box-counting technique and an A(z) value of 0.84 based

on the global Minkowski dimension in the task of distinguishing between the two

groups. By using LDA to assess the characteristics of mammograms, A(z) values of

0.90 and 0.93 were obtained in differentiating the two groups, for the modified

box-counting and Minkowski techniques, respectively. Statistically significant

improvement was achieved (P < .05) with the new techniques compared to the

conventional FRACTAL analysis methods. A simulation study, which used the slope

and intercept extracted from the least square fit of the experimental data with

the LDA approaches, yielded A(z) values similar to those obtained with the

conventional approaches in the task of differentiating between the two groups.

CONCLUSIONS: The proposed LDA approach improved significantly the separation

between the two groups based on experimental data. Because this approach was used

as a linear classifier rather than as a regression function, it combined the

FRACTAL analysis with the knowledge of the high- and low-risk patterns, and thus

better characterized the multiFRACTAL nature of the parenchymal patterns. We

believe that the proposed analyses based on the LDA technique to characterize

mammographic parenchymal patterns may potentially yield radiographic markers for

assessing breast cancer risk.

PMID: 17434064 [PubMed - indexed for MEDLINE]
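
The conventional box-counting method referred to above estimates a capacity dimension from how the number of occupied boxes scales with box size. A minimal sketch for a square binary image (illustrative only; the paper's LDA-modified variants are not reproduced):

```python
import numpy as np

def box_counting_dimension(img, threshold=0.5):
    """Estimate the box-counting (capacity) dimension of a 2^m x 2^m binary
    pattern: count occupied boxes N(s) at dyadic box sizes s and fit
    log N(s) ~ -D log s by least squares."""
    binary = img > threshold
    size = binary.shape[0]
    sizes, counts = [], []
    s = size
    while s >= 1:
        # Partition into (size/s)^2 boxes of side s; a box is occupied if it
        # contains at least one pixel above threshold.
        view = binary.reshape(size // s, s, size // s, s)
        counts.append(view.any(axis=(1, 3)).sum())
        sizes.append(s)
        s //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity checks: a filled square is 2-dimensional, a diagonal line 1-dimensional.
d_square = box_counting_dimension(np.ones((64, 64)))
d_line = box_counting_dimension(np.eye(64))
```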

170. Conf Proc IEEE Eng Med Biol Soc. 2005;5:4783-6.

MultiFRACTAL Analysis of Genomic Sequences CGR Images.

Fu W(1), Wang Y, Lu D.

Author information:

(1)Department of Electronic Engineering, Fudan University, Shanghai 200433, China.

To describe the FRACTAL feature of Chaos Game Representation (CGR) images of

genomic sequences, a multiFRACTAL theory is presented in the analysis. With the

probability set of CGR images, the general dimension spectrum and the

multiFRACTAL spectrum are calculated and compared between two sample groups of

gene thick sequences and gene black sequences. The experimental result shows that

the probability set composed of the relative probability has the best

scale-invariance performance. The scale invariance has different variance

regions, which indicates that genomic sequence segments with different lengths have

different distribution rules. It is also shown that the spectra for the two

groups are different, especially the attenuation index and value range of the

general dimension spectrum, and the width of the multiFRACTAL spectrum. It is

concluded that the multiFRACTAL analysis and its parameters may be useful for

analyzing the statistical properties of sequences and recognizing gene sequences.

PMID: 17281311 [PubMed]

171. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Dec;74(6 Pt 1):061110. Epub 2006

Dec 12.

Application of the microcanonical multiFRACTAL formalism to monoFRACTAL systems.

Pont O(1), Turiel A, Pérez-Vicente CJ.

Author information:

(1)Departament de Física Fonamental, Universitat de Barcelona, Diagonal, 647, 08028

Barcelona, Spain.

The design of appropriate multiFRACTAL analysis algorithms, able to correctly

characterize the scaling properties of multiFRACTAL systems from experimental,

discretized data, is a major challenge in the study of such scale invariant

systems. In recent years, a growing interest in the application of the

microcanonical formalism has taken place, as it allows a precise localization of

the FRACTAL components as well as a statistical characterization of the system.

In this paper, we deal with the specific problems arising when systems that are

strictly monoFRACTAL are analyzed using some standard microcanonical multiFRACTAL

methods. We discuss the adaptations of these methods needed to give an

appropriate treatment of monoFRACTAL systems.

PMID: 17280041 [PubMed]

172. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Dec;74(6 Pt 1):061104. Epub 2006

Dec 7.

Detrended fluctuation analysis for FRACTALs and multiFRACTALs in higher

dimensions.

Gu GF(1), Zhou WX.

Author information:

(1)School of Business, East China University of Science and Technology, Shanghai

200237, China.

One-dimensional detrended fluctuation analysis (DFA) and multiFRACTAL detrended

fluctuation analysis (MFDFA) are widely used in the scaling analysis of FRACTAL

and multiFRACTAL time series because they are accurate and easy to implement. In

this paper we generalize the one-dimensional DFA and MFDFA to higher-dimensional

versions. The generalization works well when tested with synthetic surfaces

including fractional Brownian surfaces and multiFRACTAL surfaces. The

two-dimensional MFDFA is also adopted to analyze two images from nature and

experiment, and nice scaling laws are unraveled.

PMID: 17280035 [PubMed]
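
One-dimensional DFA, which this paper generalizes to higher dimensions, fits in a few lines: integrate the mean-subtracted series, detrend it with a polynomial in non-overlapping windows of length s, and read the exponent alpha off the log-log slope of the RMS fluctuation F(s) ~ s^alpha. A minimal first-order sketch (illustrative, not the authors' implementation):

```python
import numpy as np

def dfa(x, scales=(8, 16, 32, 64, 128), order=1):
    """One-dimensional detrended fluctuation analysis.

    Returns the scaling exponent alpha from F(s) ~ s^alpha, where F(s) is the
    RMS residual after polynomial detrending of the integrated profile."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    fluct = []
    for s in scales:
        t = np.arange(s)
        f2 = []
        for w in range(len(y) // s):         # non-overlapping windows
            seg = y[w * s:(w + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, order), t)
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

# White noise should give alpha near 0.5; its running sum (a random walk)
# should give alpha near 1.5.
rng = np.random.default_rng(1)
alpha_noise = dfa(rng.normal(size=4096))
alpha_walk = dfa(np.cumsum(rng.normal(size=4096)))
```

MFDFA follows the same recipe with the squared residuals raised to the power q/2 before averaging, yielding a q-dependent exponent h(q).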

173. IEEE Trans Biomed Eng. 2006 Oct;53(10):1920-5.

MultiFRACTAL ECG mapping of ventricular epicardium during regional ischemia in

the pig.

Chen Y(1), Nash MP, Ning X, Wang Y, Paterson DJ, Wang J.

Author information:

(1)Department of Electronic Science and Engineering, Nanjing University, Nanjing CO

1051, China.

Myocardial ischemia creates abnormal electrophysiological substrates that can

result in life-threatening ventricular arrhythmias. Early clinical identification

of ischemia in patients is important to managing their condition. We analyzed

electrograms from an ischemia-reperfusion animal model in order to investigate

the relationship between myocardial ischemia and variability of electrocardiogram

(ECG) multiFRACTALity. Ventricular epicardial electropotential maps from the

anesthetized pig during LAD ischemia-reperfusion were analyzed using multiFRACTAL

methods. A new parameter called the singularity spectrum area reference

dispersion (SARD) is presented to represent the temporal evolution of

multiFRACTALity. By contrasting the ventricular epicardial SARD and range of

singularity strength (delta alpha) maps against activation-recovery interval

(ARI) maps, we found that the dispersions of SARD and delta alpha increased

following the onset of ischemia and decreased with tissue recovery. In addition,

steep spatial gradients of SARD and delta alpha corresponded to locations of

ischemia, although the distribution of multiFRACTALity did not reflect the degree

of myocardial ischemia. However, the multiFRACTALity of the ventricular

epicardial electrograms was useful for classifying the recoverability of ischemic

tissue. Myocardial ischemia significantly influenced the multiFRACTALity of

ventricular electrical activity. Recoverability of ischemic myocardium can be

classified using the multiFRACTALity of ventricular epicardial electrograms. The

location and size of regions of severe ischemic myocardium with poor

recoverability are detectable using these methods.

PMID: 17019855 [PubMed - indexed for MEDLINE]

174. Chaos. 2006 Sep;16(3):033129.

Self-organized critical gating of ion channels: on the origin of long-term memory

in dwell time series.

Brazhe AR(1), Maksimov GV.

Author information:

(1)Biophysics Department, Faculty of Biology, Moscow State University, Moscow, Russia.


We present a model of an ion channel operating in a regime of self-organized

criticality. It is suggested that complex cooperative dynamics takes place in the

protein, whose overall tension facilitates an open or closed state of the

rigid gates in the pore-making domain. For the first time, multiFRACTAL spectra of

ion channel dynamics are presented. Our model reproduces the multiFRACTAL

properties of ion channel dwell time series well and provides insight into the

origin of the long-term correlations in these series.

PMID: 17014234 [PubMed - indexed for MEDLINE]

175. J Neurosurg Anesthesiol. 2006 Oct;18(4):223-9.

Identification of patients with childhood moyamoya diseases showing temporary

hypertension after anesthesia by preoperative multiFRACTAL Hurst analysis of

heart rate variability.

Yum MK(1), Oh AY, Lee HM, Kim CS, Kim SD, Lee YS, Wang KC, Chung YN, Kim HS.

Author information:

(1)Department of Pediatrics, College of Medicine, Hanyang University, Korea.

OBJECTIVE: This study was performed to determine whether the preoperative

multiFRACTAL Hurst analysis of heart rate variability might identify and

characterize childhood patients with moyamoya disease (MMD) who showed temporary

postoperative hypertension.

METHODS: We studied 59 childhood patients with MMD. Thirty were classified as

hypertensive group when the mean arterial pressure in the postoperative recovery

room was 120% or greater than that during the preoperative period and 29 were

classified as normotensive group. The 2 groups were compared with respect to

preoperative indices of heart rate variability including frequency-domain

measures, approximate entropy, and very short-term multiFRACTAL Hurst exponents

of RR intervals (RRI). Using preoperative indices that showed significant

differences, discriminant analysis was performed to identify postoperative

hypertensive patients.

RESULTS: Only exponents of the order > or =3 (H3alpha, H4alpha, and H5alpha) were

significantly lower in the hypertensive group than in the normotensive group.

Frequency-domain measures, approximate entropy, and the exponents of the order <

or =2 were not significantly different in the 2 groups. Discriminant analysis

using all of the three exponents correctly identified 27/30 (90%) of the

postoperative hypertensive patients.

CONCLUSIONS: Preoperative very short-term multiFRACTAL Hurst analysis of RRI

variability identified 90% of childhood MMD patients who developed postoperative

hypertension. The preoperative characteristic of RRI variability was the reduced

smoothness in 8-second-long local RRI regions within which a very large

change of RRI occurs.

PMID: 17006118 [PubMed - indexed for MEDLINE]

176. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Jul;74(1 Pt 2):016103. Epub 2006

Jul 6.

Wavelet versus detrended fluctuation analysis of multiFRACTAL structures.

Oświecimka P(1), Kwapień J, Drozdz S.

Author information:

(1)Institute of Nuclear Physics, Polish Academy of Sciences, Kraków, Poland.

We perform a comparative study of applicability of the multiFRACTAL detrended

fluctuation analysis (MFDFA) and the wavelet transform modulus maxima (WTMM)

method in properly detecting the monoFRACTAL and multiFRACTAL character of data. We

quantify the performance of both methods by using different sorts of artificial

signals generated according to a few well-known exactly soluble mathematical

models: monoFRACTAL fractional Brownian motion, biFRACTAL Lévy flights, and

different sorts of multiFRACTAL binomial cascades. Our results show that in the

majority of situations in which one does not know a priori the FRACTAL properties

of a process, choosing MFDFA should be recommended. In particular, WTMM gives

biased outcomes for the fractional Brownian motion with different values of Hurst

exponent, indicating spurious multiFRACTALity. In some cases WTMM can also give

different results if one applies different wavelets. We do not exclude using WTMM

in real data analysis, but it turns out that while one may apply MFDFA in a more

automatic fashion, WTMM must be applied with care. In the second part of our

work, we perform an analogous analysis on empirical data coming from the American

and from the German stock market. For this data both methods detect rich

multiFRACTALity in terms of broad f(alpha), but MFDFA suggests that this

multiFRACTALity is poorer than in the case of WTMM.

PMID: 16907147 [PubMed]

177. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Jun;73(6 Pt 2):066125. Epub 2006

Jun 26.

Inhomogeneous sandpile model: Crossover from multiFRACTAL scaling to finite-size

scaling.

Cernák J.

Author information:

Department of Biophysics, University of P. J. Safárik in Kosice, Jesenná 5,

SK-04000 Kosice, Slovak Republic.

We study an inhomogeneous sandpile model in which two different toppling rules

are defined. For any site only one rule is applied corresponding to either the

Bak, Tang, and Wiesenfeld model [P. Bak, C. Tang, and K. Wiesenfeld, Phys. Rev.

Lett. 59, 381 (1987)] or the Manna two-state sandpile model [S. S. Manna, J.

Phys. A 24, L363 (1991)]. A parameter c is introduced which describes a density

of sites which are randomly deployed and where the stochastic Manna rules are

applied. The results show that the avalanche area exponent tau a, avalanche size

exponent tau s, and capacity FRACTAL dimension Ds depend on the density c. A

crossover from multiFRACTAL scaling of the Bak, Tang, and Wiesenfeld model (c =

0) to finite-size scaling was found. The critical density c is found to be in the

interval 0 < c < 0.01. These results demonstrate that local dynamical rules are

important and can change the global properties of the model.

PMID: 16906932 [PubMed]

178. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Jun;73(6 Pt 2):066102. Epub 2006

Jun 1.

Statistical self-similar properties of complex networks.

Lee CY(1), Jung S.

Author information:

(1)The Department of Industrial Information, Kongju National University, Chungnam,

340-702 South Korea.

It has been shown that many complex networks share distinctive features, which

differ in many ways from the random and the regular networks. Although these

features capture important characteristics of complex networks, their

applicability depends on the type of networks. To unravel ubiquitous

characteristics that complex networks may have in common, we adopt the clustering

coefficient as the probability measure, and present a systematic analysis of

various types of complex networks from the perspective of statistical

self-similarity. We find that the probability distribution of the clustering

coefficient is best characterized by a multiFRACTAL; moreover, the support of

the measure has a FRACTAL dimension. These two features enable us to describe

complex networks in a unified way; at the same time, offer unforeseen

possibilities to comprehend complex networks.

PMID: 16906909 [PubMed]
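
The probability measure in this study is built from per-node clustering coefficients: for each node, the fraction of its neighbour pairs that are themselves linked. A minimal sketch for an undirected graph stored as adjacency sets (illustrative; node labels are assumed orderable):

```python
def clustering_coefficients(adj):
    """Local clustering coefficient per node of an undirected graph given as
    {node: set_of_neighbours}: fraction of neighbour pairs that are connected."""
    cc = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            cc[v] = 0.0                      # undefined for k < 2; set to zero
            continue
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        cc[v] = 2.0 * links / (k * (k - 1))
    return cc

# Triangle 0-1-2 with a pendant node 3 attached to node 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cc = clustering_coefficients(adj)
```

The histogram of these values over all nodes is the distribution whose multiFRACTAL character the paper analyzes.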

179. IEEE Trans Med Imaging. 2006 Aug;25(8):1101-7.

MultiFRACTAL analysis of human retinal vessels.

Stosić T(1), Stosić BD.

Author information:

(1)Departamento de Estatísica e Informática, Universidade Federal Rural de

Pernambuco, Dois Irmaos, Recife-PE, Brazil.

In this paper, it is shown that vascular structures of the human retina represent

geometrical multiFRACTALs, characterized by a hierarchy of exponents rather than

a single FRACTAL dimension. A number of retinal images from the STARE database

are analyzed, corresponding to both normal and pathological states of the retina.

In all studied cases, clearly multiFRACTAL behavior is observed, where the capacity

dimension is always found to be larger than the information dimension, which is

in turn always larger than the correlation dimension, all three being

significantly lower than the diffusion-limited aggregation (DLA) FRACTAL

dimension. We also observe a tendency of images corresponding to the pathological

states of the retina to have lower generalized dimensions and a shifted spectrum

range, in comparison with the normal cases.

PMID: 16895002 [PubMed - indexed for MEDLINE]

180. Neuroimage. 2006 Sep;32(3):1158-66. Epub 2006 Jul 11.

MultiFRACTAL analysis of deep white matter microstructural changes on MRI in

relation to early-stage atherosclerosis.

Takahashi T(1), Murata T, Narita K, Hamada T, Kosaka H, Omori M, Takahashi K,

Kimura H, Yoshida H, Wada Y.

Author information:

(1)Department of Neuropsychiatry, Faculty of Medical Sciences, University of Fukui,

23-3 Shimoaizuki, Matsuoka-cho, Yoshida-gun, Fukui 910-1193, Japan.

MultiFRACTAL analysis based on generalized concepts of FRACTALs has been applied

to evaluate biological tissues composed of complex structures. This type of

analysis can provide a precise quantitative description of a broad range of

heterogeneous phenomena. Previously, we applied multiFRACTAL analysis to describe

heterogeneity in white matter signal fluctuation on T2-weighted MR images as a

new method of texture analysis and established Deltaalpha as the most suitable

index for evaluating white matter structural complexity (Takahashi et al. J.

Neurol. Sci., 2004; 225: 33-37). Considerable evidence suggests that

pathophysiological processes occurring in deep white matter regions may be partly

responsible for cognitive deterioration and dementia in elderly subjects. We

carried out a multiFRACTAL analysis in a group of 36 healthy elderly subjects who

showed no evidence of atherosclerotic risk factors to examine the microstructural

changes of the deep white matter on T2-weighted MR images. We also performed

conventional texture analysis, i.e., determined the standard deviation of signal

intensity divided by mean signal intensity (SD/MSI) for comparison with

multiFRACTAL analysis. Next, we examined the association between the findings of

these two types of texture analysis and the ultrasonographically measured

intima-media thickness (IMT) of the carotid arteries, a reliable indicator of

early carotid atherosclerosis. The severity of carotid IMT was positively

associated with Deltaalpha in the deep white matter region. In addition, this

association remained significant after excluding 12 subjects with visually

detectable deep white matter hyperintensities on MR images. However, there was no

significant association between the severity of carotid IMT and SD/MSI. These

results indicate the potential usefulness of applying multiFRACTAL analysis to

conventional MR images as a new approach to detect the microstructural changes of

apparently normal white matter during the early stages of atherosclerosis.

PMID: 16815037 [PubMed - indexed for MEDLINE]

181. Phys Rev E Stat Nonlin Soft Matter Phys. 2006 Mar;73(3 Pt 1):031920. Epub 2006

Mar 21.

Clustering of protein structures using hydrophobic free energy and solvent

accessibility of proteins.

Yu ZG(1), Anh VV, Lau KS, Zhou LQ.

Author information:

(1)Program in Statistics and Operations Research, Queensland University of

Technology, GPO Box 2434, Brisbane, Queensland 4001, Australia.

The hydrophobic free energy and solvent accessibility of amino acids are used to

study the relationship between the primary structure and structural

classification of large proteins. A measure representation and a Z curve

representation of protein sequences are proposed. FRACTAL analysis of the measure

and Z curve representations of proteins and multiFRACTAL analysis of their

hydrophobic free energy and solvent accessibility sequences indicate that the

protein sequences possess correlations and multiFRACTAL scaling. The parameters

from the FRACTAL and multiFRACTAL analyses on these sequences are used to

construct some parameter spaces. Each protein is represented by a point in these

spaces. A method is proposed to distinguish and cluster proteins from the alpha,

beta, alpha + beta, and alpha/beta structural classes in these parameter spaces.

Fisher's linear discriminant algorithm is used to give a quantitative assessment

of our clustering on the selected proteins. Numerical results indicate that the

discriminant accuracies are satisfactory. In particular, they reach 94.12% and

88.89% in separating proteins from {alpha, alpha + beta, alpha/beta} proteins in

a three-dimensional space.

PMID: 16605571 [PubMed - indexed for MEDLINE]

182. Hum Biol. 2005 Oct;77(5):577-617.

Identification of spatial genetic boundaries using a multiFRACTAL model in human

population genetics.

Xue F(1), Wang J, Hu P, Ma D, Liu J, Li G, Zhang L, Wu M, Sun G, Hou H.

Author information:

(1)Department of Epidemiology and Biostatistics, School of Public Health, Shandong

University China, No. 44 Wen-hua-xi-lu Road, Jinan City, Shandong 250012,

People's Republic of China.

There are two purposes in displaying spatial genetic structure. One is that a

visual representation of the variation of the genetic variable should be provided

in the contour map. The other is that spatial genetic structure should be

reflected by the patterns or the gradients with genetic boundaries in the map.

Nevertheless, most conventional interpolation methods, such as Cavalli-Sforza's

method in genography, inverse distance-weighted methods, and the Kriging

technique, focus only on the first primary purpose because of their arbitrary

thresholds marked on the maps. In this paper we present an application of the

contour area multiFRACTAL model (CAMM) to human population genetics. The method

enables the analysis of the geographic distribution of a genetic marker and

provides an insight into the spatial and geometric properties of obtained

patterns. Furthermore, the CAMM may overcome some of the limitations of other

interpolation techniques because no arbitrary thresholds are necessary in the

computation of genetic boundaries. The CAMM is built by establishing power law

relationships between the area A (> or =rho) in the contour map and the value rho

itself after plotting these values on a log-log graph. A series of straight-line

segments can be fitted to the points on the log-log graph, each representing a

power law relationship between the area A (> or =rho) and the cutoff genetic

variable value for rho in a particular range. These straight-line segments can

yield a group of cutoff values, which can be identified as the genetic boundaries

that can classify the map of genetic variable into discrete genetic zones. These

genetic zones usually correspond to spatial genetic structure on the landscape.

To provide a better understanding of the interest in the CAMM approach, we

analyze the spatial genetic structures of three loci (ABO, HLA-A, and TPOX) in

China using the CAMM. Each synthetic principal component (SPC) contour map of the

three loci is created by using both Han and minority groups data together. These

contour maps all present an obvious geographic diversity, which gradually

increases from north to south, and show that the genetic differences among

populations in different districts of the same nationality are greater than those

among different nationalities of the same district. It is surprising to find that

both the value of rho and the FRACTAL dimension alpha have a clear north to south

gradient for each locus, and the same clear boundary between southern and

northern Asians in each contour map is still seen in the zone of the Yangtze

River, although substantial population migrations have occurred because of war or

famine in the last 2,000 or 3,000 years. A clear genetic boundary between

Europeans and Asians in each contour map is still seen in northwestern China with

a small value of alpha, although the genetic gradient caused by gene flow between

Europeans and Asians has tended to show expansion from northwestern China. From

the three contour maps another interesting result can be found: The values of

alpha north of the Yangtze River are generally less than those south of the

Yangtze River. This indicates that the genetic differences among the populations

north of the Yangtze River are generally smaller than those in populations south

of the Yangtze River.

PMID: 16596942 [PubMed - indexed for MEDLINE]
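
The CAMM fitting step described above — plotting the contour area A (> or =rho) against the cutoff rho on log-log axes, then fitting straight-line segments — can be sketched as follows. This is a toy illustration on a synthetic surface with a north-south gradient, not the authors' implementation; a single global fit stands in for the segment-wise fitting:

```python
import numpy as np

def contour_areas(grid, thresholds):
    """For each cutoff rho, the fraction of the map with value >= rho.

    Approximates the contour area A(>= rho) used by the CAMM approach.
    """
    return np.array([(grid >= rho).mean() for rho in thresholds])

# Synthetic "genetic variable" surface: a north-south gradient plus noise.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 1.0, 64)
grid = y[:, None] + 0.1 * rng.standard_normal((64, 64))

thresholds = np.linspace(0.05, 0.95, 30)
areas = contour_areas(grid, thresholds)

# A power law A(>= rho) ~ rho^(-alpha) appears as a straight line on log-log
# axes; a least-squares fit to log A versus log rho gives the local exponent.
# CAMM fits several such segments and reads the breakpoints off as genetic
# boundaries.
slope, intercept = np.polyfit(np.log(thresholds), np.log(areas), 1)
print(f"fitted log-log slope: {slope:.2f}")
```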

183. J Math Biol. 2006 Jun;52(6):830-74. Epub 2006 Mar 6.

FRACTAL rigidity by enhanced sympatho-vagal antagonism in heartbeat interval

dynamics elicited by central application of corticotropin-releasing factor in

mice.

Meyer M(1), Stiedl O.

Author information:

(1)FRACTAL Physiology, Max Planck Institute for Experimental Medicine, 37075

Göttingen, Germany.

The dynamics of heartbeat interval fluctuations were studied in awake

unrestrained mice following intracerebroventricular application of the

neuropeptide corticotropin-releasing factor (CRF). The cardiac time series

derived from telemetric ECG monitoring were analyzed by non-parametric techniques

of nonlinear signal processing: delay-vector variance (DVV) analysis,

higher-order variability (HOV) analysis, empirical mode decomposition (EMD),

multiscale embedding-space decomposition (MESD), multiexponent multiFRACTAL

(MEMF) analysis. The analyses support the conjecture that cardiac dynamics of

normal control mice has both deterministic and stochastic elements, is

nonstationary, nonlinear, and exerts multiFRACTAL properties. Central application

of CRF results in bradycardia and increased variability of the beat-to-beat

fluctuations. The altered dynamical properties elicited by CRF reflect a

significant loss of intrinsic structural complexity of cardiac control which is

due to central neuroautonomic hyperexcitation, i.e., enhanced sympatho-vagal

antagonism. The change in dynamical complexity is characterized by an effect

referred to as FRACTAL rigidity, leading to a significant impairment of

adaptability to extrinsic challenges in a fluctuating environment. The impact of

dynamical neurocardiopathy as a major precipitating factor for the propensity of

cardiac arrhythmias or sudden cardiac death by unchecked central CRF release in

significant acute life events in man is critically discussed.

PMID: 16521022 [PubMed - indexed for MEDLINE]

184. IEEE Trans Image Process. 2006 Mar;15(3):614-23.

Morphology-based multiFRACTAL estimation for texture segmentation.

Xia Y(1), Feng D, Zhao R.

Author information:

(1)Center for Multimedia Signal Processing, Department of Electronic and Information

Engineering, Hong Kong Polytechnic University, Hong Kong, China.

MultiFRACTAL analysis is becoming more and more popular in the image segmentation

community, in which the box-counting based multiFRACTAL dimension estimations are

most commonly used. However, in spite of its computational efficiency, the

regular partition scheme used by various box-counting methods intrinsically

produces less accurate results. In this paper, a novel multiFRACTAL estimation

algorithm based on mathematical morphology is proposed and a set of new

multiFRACTAL descriptors, namely the local morphological multiFRACTAL exponents,

is defined to characterize the local scaling properties of textures. A series of

cubic structure elements and an iterative dilation scheme are utilized so that

the computational complexity of the morphological operations can be tremendously

reduced. Both the proposed algorithm and the box-counting based methods have been

applied to the segmentation of texture mosaics and real images. The comparison

results demonstrate that the morphological multiFRACTAL estimation can

differentiate texture images more effectively and provide more robust


PMID: 16519348 [PubMed - indexed for MEDLINE]
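
The box-counting estimation that the abstract above takes as its baseline computes generalized dimensions D(q) from a regular partition of the image. A minimal sketch of that regular-partition scheme (assuming a square grayscale image treated as a measure; this is the baseline method, not the paper's morphological estimator):

```python
import numpy as np

def generalized_dimensions(image, qs, box_sizes):
    """Box-counting estimate of generalized dimensions D(q) of a 2D measure."""
    total = image.sum()
    dims = []
    for q in qs:
        log_eps, log_sum = [], []
        for s in box_sizes:
            n = image.shape[0] // s
            # Coarse-grain the measure into n x n boxes of side s.
            boxes = image[: n * s, : n * s].reshape(n, s, n, s).sum(axis=(1, 3))
            p = boxes[boxes > 0] / total
            if q == 1:  # information dimension: entropy sum instead of moments
                log_sum.append((p * np.log(p)).sum())
            else:
                log_sum.append(np.log((p ** q).sum()) / (q - 1))
            log_eps.append(np.log(s / image.shape[0]))
        slope, _ = np.polyfit(log_eps, log_sum, 1)
        dims.append(slope)
    return np.array(dims)

# Sanity check: a uniform measure on the plane gives D(q) = 2 for every q.
uniform = np.ones((256, 256))
d = generalized_dimensions(uniform, qs=[0, 1, 2], box_sizes=[2, 4, 16])
print(np.round(d, 2))  # prints [2. 2. 2.]
```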

185. J Chem Phys. 2006 Feb 14;124(6):64706.

Diffusion-limited deposition with dipolar interactions: FRACTAL dimension and

multiFRACTAL structure.

Tasinkevych M(1), Tavares JM, de Los Santos F.

Author information:

(1)Max-Planck-Institut für Metallforschung, Germany.

Computer simulations are used to generate two-dimensional diffusion-limited

deposits of dipoles. The structure of these deposits is analyzed by measuring

some global quantities: the density of the deposit and the lateral correlation

function at a given height, the mean height of the upper surface for a given

number of deposited particles, and the interfacial width at a given height.

Evidence is given that the FRACTAL dimension of the deposits remains constant

as the deposition proceeds, independently of the dipolar strength. These same

deposits are used to obtain the growth probability measure through the Monte

Carlo techniques. It is found that the distribution of growth probabilities obeys

multiFRACTAL scaling, i.e., it can be analyzed in terms of its f(alpha)

multiFRACTAL spectrum. For low dipolar strengths, the f(alpha) spectrum is

similar to that of diffusion-limited aggregation. Our results suggest that with

increasing dipolar strength both the minimal local growth exponent alpha(min)

and the information dimension D(1) decrease, while the FRACTAL dimension remains

the same.

PMID: 16483228 [PubMed]

186. IEEE Trans Biomed Eng. 2006 Jan;53(1):83-8.

Local Hölder exponent analysis of heart rate variability in preterm infants.

Nakamura T(1), Horio H, Chiba Y.

Author information:

(1)Division of Biophysical Engineering, Department of Systems and Human Science,

Graduate School of Engineering Science, Osaka University, Toyonaka, Japan.

Heart rate variability (HRV) displays scale-invariant FRACTAL properties. Recent

studies have revealed multiFRACTAL properties in the healthy human HRV, which

could be characterized by singularities with various strength of local Hölder

exponents embedded in HRV. In this paper, HRV time series from preterm infants,

whose autonomic nervous system undergoes dramatic development, were collected

longitudinally. Changes in FRACTALity/multiFRACTALity of those HRV time series as

a function of postmenstrual age were examined in order to see if they could quantify

development of the autonomic nervous system. Temporal structure of the

singularities at several representative time scales was also analyzed to show

that intersingular event intervals could be well described by "power law

distribution," and the singular events appeared with age-dependent long-range

correlation in its strength. Detailed analyses suggested that FRACTALity and

multiFRACTALity of HRV, respectively, could quantify the development of the

respiratory center and the parasympathetic nervous system in the preterm infants.

The results obtained in this study might be beneficial for detecting occurrences

of life threatening singular events such as big apnea in preterm infants.

PMID: 16402606 [PubMed - indexed for MEDLINE]

187. Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Oct;72(4 Pt 2):046213. Epub 2005

Oct 20.

Formation of multiFRACTAL population patterns from reproductive growth and local

resettlement.

Ozik J(1), Hunt BR, Ott E.

Author information:

(1)Department of Physics and Institute for Research in Electronics and Applied

Physics, University of Maryland, College Park, Maryland 20742, USA.

We consider the general character of the spatial distribution of a population

that grows through reproduction and subsequent local resettlement of new

population members. We present several simple one- and two-dimensional point

placement models to illustrate possible generic behavior of these distributions.

We show, numerically and analytically, that these models all lead to multiFRACTAL

spatial distributions of population. Additionally, we make qualitative links

between our models and the example of the Earth at Night image, showing the

Earth's nighttime man-made light as seen from space. The Earth at Night data

suffer from saturation of the sensing photodetectors at high brightness

("clipping"), and we account for how this influences the determined dimension

spectrum of the light intensity distribution.

PMID: 16383518 [PubMed - indexed for MEDLINE]

188. Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Oct;72(4 Pt 2):046101. Epub 2005

Oct 3.

MultiFRACTAL structure in nonrepresentational art.

Mureika JR(1), Dyer CC, Cupchik GC.

Author information:

(1)Department of Physics, Loyola Marymount University, Los Angeles, California

90045-8227, USA.

MultiFRACTAL analysis techniques are applied to patterns in several abstract

expressionist artworks, painted by various artists. The analysis is carried out

on two distinct types of structures: the physical patterns formed by a specific

color ("blobs") and patterns formed by the luminance gradient between adjacent

colors ("edges"). It is found that the multiFRACTAL analysis method applied to

"blobs" cannot distinguish between artists of the same movement, yielding a

multiFRACTAL spectrum of dimensions between about 1.5 and 1.8. The method can

distinguish between different types of images, however, as demonstrated by

studying a radically different type of art. The data suggest that the "edge"

method can distinguish between artists in the same movement and is proposed to

represent a toy model of visual discrimination. A "FRACTAL reconstruction"

analysis technique is also applied to the images in order to determine whether or

not a specific signature can be extracted which might serve as a type of

fingerprint for the movement.

PMID: 16383462 [PubMed]

189. Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Oct;72(4 Pt 2):045301. Epub 2005

Oct 14.

Cancellation exponent and multiFRACTAL structure in two-dimensional

magnetohydrodynamics: direct numerical simulations and Lagrangian averaged

modeling.

Graham JP(1), Mininni PD, Pouquet A.

Author information:

(1)National Center for Atmospheric Research, P.O. Box 3000, Boulder, Colorado 80307,

USA.

We present direct numerical simulations and Lagrangian averaged (also known as

alpha model) simulations of forced and free decaying magnetohydrodynamic

turbulence in two dimensions. The statistics of sign cancellations of the current

at small scales is studied using both the cancellation exponent and the FRACTAL

dimension of the structures. The alpha model is found to have the same scaling

behavior between positive and negative contributions as the direct numerical

simulations. The alpha model is also able to reproduce the time evolution of

these quantities in free decaying turbulence. At large Reynolds numbers, an

independence of the cancellation exponent with the Reynolds numbers is observed.

PMID: 16383461 [PubMed]

190. Biol Cybern. 2006 Feb;94(2):149-56. Epub 2005 Dec 9.

MultiFRACTALity of decomposed EEG during imaginary and real visual-motor

tracking.

Popivanov D(1), Stomonyakov V, Minchev Z, Jivkova S, Dojnov P, Jivkov S,

Christova E, Kosev S.

Author information:

(1)Institute of Physiology, Bulgarian Academy of Sciences, 1113 Sofia, Bulgaria.

We test the possible multiFRACTAL properties of dominant EEG frequency

components, when a subject tracks a path on a map, either only by eyes (imaginary

movement - IM) or by visual-motor tracking of a discretely moving spot in regular

(RM) and Brownian time-step (BM) (real tracking of moving spot). We check the

hypotheses that the FRACTAL properties of filtered EEG (1) change with respect to

the law of spot movement; (2) differ among filtered EEG components and scalp

sites; (3) differ among real and imaginary tracking. Sixteen right-handed

subjects begin to perform IM, next--real spot tracking (RM and BM) following a

moving spot on streets of a city map displayed on a computer screen, by pushing

a joystick forward/backward. Multichannel long-lasting EEG is band-pass filtered

for theta, alpha, beta and gamma oscillations. The

Wavelet-Transform-Modulus-Maxima-Method is applied to reveal multiFRACTALity

[local FRACTAL dimensions Dmax(h)] among task conditions, frequency bands and

sites. Non-parametric statistical estimation of the FRACTAL measures h (Dmax) is

finally applied. MultiFRACTALity is established for all experimental conditions,

EEG components and sites as follows among filtered components - anticorrelation

(h(Dmax) < 0.5) in beta and gamma, and long-range correlation (h(Dmax) > 0.5) for

theta and alpha oscillations; among tasks--for RM and BM, h (Dmax) differ

significantly whereas IM resembles mostly RM; among sites--no significant

difference for local FRACTAL properties is established. The results suggest that

for both imaginary and real visual-motor tracking a line, multiFRACTAL scaling,

specific for lower and higher EEG oscillations, is a very stable intrinsic one

for the activity of large brain areas. The external events (task conditions)

exert a weak effect on the scaling.

PMID: 16341722 [PubMed - indexed for MEDLINE]

191. IEEE Trans Image Process. 2005 Oct;14(10):1435-47.

Image denoising based on wavelets and multiFRACTALs for singularity detection.

Zhong J(1), Ning R.

Author information:

(1)Department of Radiology, University of Rochester, Rochester, NY 14642, USA.

This paper presents a very efficient algorithm for image denoising based on

wavelets and multiFRACTALs for singularity detection. A challenge of image

denoising is how to preserve the edges of an image when reducing noise. By

modeling the intensity surface of a noisy image as statistically self-similar

multiFRACTAL processes and taking advantage of the multiresolution analysis with

wavelet transform to exploit the local statistical self-similarity at different

scales, the pointwise singularity strength value characterizing the local

singularity at each scale was calculated. By thresholding the singularity

strength, wavelet coefficients at each scale were classified into two categories:

the edge-related and regular wavelet coefficients and the irregular coefficients.

The irregular coefficients were denoised using an approximate minimum

mean-squared error (MMSE) estimation method, while the edge-related and regular

wavelet coefficients were smoothed using the fuzzy weighted mean (FWM) filter

aiming at preserving the edges and details when reducing noise. Furthermore, to

make the FWM-based filtering more efficient for noise reduction at the lowest

decomposition level, the MMSE-based filtering was performed as the first pass of

denoising followed by performing the FWM-based filtering. Experimental results

demonstrated that this algorithm could achieve both good visual quality and high

PSNR for the denoised images.

PMID: 16238050 [PubMed - indexed for MEDLINE]

192. Proc Biol Sci. 2005 Sep 7;272(1574):1815-22.

Metapopulations in multiFRACTAL landscapes: on the role of spatial aggregation.

Gamarra JG.

Author information:

Department of Natural Resources, Center for the Environment, Cornell University,

103 Rice Hall, Ithaca, NY 14853, USA.

The use of FRACTALs in ecology is currently pervasive over many areas. However,

very few studies have linked FRACTAL properties of landscapes to generating

ecological mechanisms and dynamics. In this study I show that lacunarity (a

measure of the landscape texture) is a well suited ecologically scaled landscape

index that can be explicitly incorporated in metapopulation models such as the

classical Levins equation. I show that the average lacunarity of an aggregated

landscape is linearly correlated to the habitat that a species with locally

processed spatial information may perceive. Lacunarity is a computationally feasible

index to measure, and is related to the metapopulation capacity of landscapes. A

general approach to multiFRACTAL landscapes has been conceived, and some

analytical results for self-similar landscapes are outlined, including the

specific effect of landscape heterogeneity, decoupled from that of contagion by

dispersal. Spatially explicit simulations show agreement with the semi-implicit

method presented.

PMCID: PMC1559862

PMID: 16096094 [PubMed - indexed for MEDLINE]

193. Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Jul;72(1 Pt 1):011913. Epub 2005

Jul 21.

Volatility of linear and nonlinear time series.

Kalisky T(1), Ashkenazy Y, Havlin S.

Author information:

(1)Minerva Center and Department of Physics, Bar-Ilan University, Ramat-Gan, Israel.

Previous studies indicated that nonlinear properties of Gaussian distributed time

series with long-range correlations, u(i), can be detected and quantified by

studying the correlations in the magnitude series |u(i)|, the "volatility."

However, the origin for this empirical observation still remains unclear and the

exact relation between the correlations in u(i) and the correlations in |u(i)| is

still unknown. Here we develop analytical relations between the scaling exponent

of linear series u(i) and its magnitude series |u(i)|. Moreover, we find that

nonlinear time series exhibit stronger (or the same) correlations in the

magnitude time series compared with linear time series with the same two-point

correlations. Based on these results we propose a simple model that generates

multiFRACTAL time series by explicitly inserting long range correlations in the

magnitude series; the nonlinear multiFRACTAL time series is generated by

multiplying a long-range correlated time series (that represents the magnitude

series) with uncorrelated time series [that represents the sign series sgn

(u(i))]. We apply our techniques on daily deep ocean temperature records from the

equatorial Pacific, the region of the El-Niño phenomenon, and find: (i)

long-range correlations from several days to several years with 1/f power

spectrum, (ii) significant nonlinear behavior as expressed by long-range

correlations of the volatility series, and (iii) broad multiFRACTAL spectrum.

PMID: 16090007 [PubMed - indexed for MEDLINE]
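
The generation scheme proposed in the abstract above — multiplying a long-range correlated magnitude series by an uncorrelated sign series — can be sketched as follows. The Fourier-filtering generator and the exponent beta = 0.8 are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def long_range_correlated(n, beta, rng):
    """Gaussian series with a 1/f^beta power spectrum via Fourier filtering."""
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)  # amplitude ~ f^(-beta/2)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(1)
n = 4096
# Magnitude series: long-range correlated; sign series: uncorrelated coin flips.
magnitude = np.abs(long_range_correlated(n, beta=0.8, rng=rng))
signs = rng.choice([-1.0, 1.0], size=n)
# The product inherits linear decorrelation from the signs while its
# volatility |u(i)| keeps the long-range correlations -- the model's
# mechanism for nonlinear, multiFRACTAL-type behavior.
series = magnitude * signs
```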

194. Phys Rev E Stat Nonlin Soft Matter Phys. 2005 May;71(5 Pt 2):056121. Epub 2005

May 27.

MultiFRACTAL properties of the harmonic measure on Koch boundaries in two and

three dimensions.

Grebenkov DS(1), Lebedev AA, Filoche M, Sapoval B.

Author information:

(1)Laboratoire de Physique de la Matière Condensée, C.N.R.S. Ecole Polytechnique,

91128 Palaiseau, France.

The multiFRACTAL properties of the harmonic measure on quadratic and cubic Koch

boundaries are studied with the help of a new fast random walk algorithm adapted

to these FRACTAL geometries. The conjectural logarithmic development of local

multiFRACTAL exponents is guessed for regular FRACTALs and checked by extensive

numerical simulations. This development allows one to compute the multiFRACTAL

exponents of the harmonic measure with high accuracy, even with the first

generations of the FRACTAL. In particular, the information dimension in the case

of the concave cubic Koch surface embedded in three dimensions is found to be

slightly higher than its value D1 = 2 for a smooth boundary.

PMID: 16089616 [PubMed]

195. Chemosphere. 2006 Feb;62(6):934-46. Epub 2005 Aug 2.

Scaling characteristics in ozone concentration time series (OCTS).

Lee CK(1), Juang LC, Wang CC, Liao YY, Yu CC, Liu YC, Ho DS.

Author information:

(1)Department of Environmental Engineering, Green Environment R&D Center, Vanung

University, Chung-Li 320, Taiwan, ROC.

One-year series of hourly average ozone observations, which were obtained from

urban and national park air monitoring stations at Taipei (Taiwan), were analyzed

by means of descriptive statistics and FRACTAL methods to examine the scaling

structures of ozone concentrations. It was found that all ozone measurements

exhibited the characteristic right-skewed frequency distribution, cyclic pattern,

and long-term memory. A mono-FRACTAL analysis was performed by transferring the

ozone concentration time series (OCTS) into a useful compact form, namely, the

box-dimension (D(B))-threshold (T(h)) and critical scale (C(S))-threshold (T(h))

plots. Scale invariance was found in these time series and the box dimension was

shown to be a decreasing function of the threshold ozone level, implying the

existence of multiFRACTAL characteristics. To test this hypothesis, the OCTS were

transferred into the multiFRACTAL spectra, namely, the tau(q)-q plots. The

analysis confirmed the existence of multiFRACTAL characteristics in the

investigated OCTS. A simple two-scale Cantor set with unequal scales and weights

was then used to fit the calculated tau(q)-q plots. This model fitted remarkably

well the entire spectrum of scaling exponents for the examined OCTS. Because the

existence of chaos behavior in OCTS has been reported in the literature, the

possibility of a chaotic multiFRACTAL approach for OCTS characterization was


PMID: 16081138 [PubMed - indexed for MEDLINE]
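
The two-scale Cantor set used above to fit the tau(q)-q plots satisfies the mass-exponent relation p1^q * l1^(-tau) + p2^q * l2^(-tau) = 1, which can be solved numerically for tau(q). A minimal sketch with made-up scales and weights (l1, l2, p1 below are illustrative, not the fitted values from the OCTS data):

```python
import numpy as np

def tau_two_scale_cantor(q, l1, l2, p1):
    """Solve p1^q * l1^(-tau) + p2^q * l2^(-tau) = 1 for tau(q) by bisection.

    For l1, l2 < 1 the left-hand side is increasing in tau, so the root
    is unique.
    """
    p2 = 1.0 - p1

    def f(tau):
        return p1 ** q * l1 ** (-tau) + p2 ** q * l2 ** (-tau) - 1.0

    lo, hi = -50.0, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative parameters; tau(1) = 0 always, since p1 + p2 = 1.
qs = np.linspace(-5.0, 5.0, 11)
taus = np.array([tau_two_scale_cantor(q, l1=0.25, l2=0.4, p1=0.6) for q in qs])
```

Fitting consists of adjusting l1, l2, p1 until this tau(q) curve matches the one estimated from the time series.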

196. J Neuroeng Rehabil. 2005 Aug 2;2:24.

Fractional Langevin model of gait variability.

West BJ(1), Latka M.

Author information:

(1)Mathematical and Informational Sciences Directorate US Army Research Office,

Research Triangle Park, NC 27709, USA.

The stride interval in healthy human gait fluctuates from step to step in a

random manner and scaling of the interstride interval time series motivated

previous investigators to conclude that this time series is FRACTAL. Early

studies suggested that gait is a monoFRACTAL process, but more recent work

indicates the time series is weakly multiFRACTAL. Herein we present additional

evidence for the weakly multiFRACTAL nature of gait. We use the stride interval

time series obtained from ten healthy adults walking at a normal relaxed pace for

approximately fifteen minutes each as our data set. A fractional Langevin

equation is constructed to model the underlying motor control system in which the

order of the fractional derivative is itself a stochastic quantity. Using this

model we find the FRACTAL dimension for each of the ten data sets to be in

agreement with earlier analyses. However, with the present model we are able to

draw additional conclusions regarding the nature of the control system guiding

walking. The analysis presented herein suggests that the observed scaling in

interstride interval data may not be due to long-term memory alone, but may, in

fact, be due partly to the statistics.

PMCID: PMC1224863

PMID: 16076394 [PubMed]

197. J Chem Phys. 2005 Jun 1;122(21):214725.

MultiFRACTAL analysis of dynamic potential surface of ion-conducting materials.

Habasaki J(1), Ngai KL.

Author information:

(1)Tokyo Institute of Technology, 4259 Nagatsuta-cho, Yokohama 226-8502, Japan.

A multiFRACTAL analysis using singularity spectra [T.C. Halsey et al., Phys. Rev.

A 33, 1141 (1986)] provides a general tool to study the temporal-spatial

properties of particles in complex disordered materials such as ions in ionically

conducting glasses and melts. Obtained by molecular-dynamics simulations, the

accumulated positions of the particles dynamically form a structural pattern

called the dynamical potential surface. In this work, the complex dynamical

potential surfaces of Li ions in the lithium silicates were visualized and

characterized by the multiFRACTAL analysis. The FRACTAL dimensions and strength

of the singularity related to the spatial intermittency of the dynamics are

examined, and the relationship between dynamics and the singularity spectra is


PMID: 15974780 [PubMed]

198. Hypertens Res. 2004 Dec;27(12):911-8.

A consistent abnormality in the average local smoothness of fetal heart rate in

growth-restricted fetuses affected by severe pre-eclampsia.

Yum MK(1), Kim K, Kim JH, Park EY.

Author information:

(1)Department of Pediatrics, School of Medicine, Hanyang University, Seoul, Korea.

An abnormality in cardiovascular regulation during the prenatal period has been

suggested to be the pathophysiological link between fetal growth restriction and

adult hypertension. The purpose of this study was to determine how consistently

abnormal the local smoothness of the very-short-term heart rate is in

growth-restricted fetuses associated with severe pre-eclamptic pregnancy.

MultiFRACTAL Hurst analysis on the structure function of heart rate was performed

in control fetuses (n = 150), in fetuses affected by severe pre-eclampsia and not

showing growth restriction (n = 66) and in fetuses affected by severe

pre-eclampsia and showing growth restriction (n = 58). The very-short-term (< or

=15 heart beats) generalized Hurst exponents of the order of -5 to 5 in three

groups were compared. Each exponent quantifies an average local heart rate

smoothness at 15-successive-heart rate sites, which were specified by the

magnitude of the heart rate variation within the sites determined by and

positively correlated with the order of the exponent. This means that the fetal

heart rates within the sites of q > or =2 have a large fetal heart rate (FHR)

variation, and those within the sites of q < or =-2 have a small FHR variation.

In the fetuses affected by severe pre-eclampsia and not showing growth

restriction, only values of the exponents of the order > or =2 were abnormally

lower. In the fetuses affected by severe pre-eclampsia and showing growth

restriction, the values of the exponents of all orders were abnormally lower. In

conclusion, the local smoothness of heart rate is consistently abnormal

regardless of the magnitude of heart rate variation within a very-short-term

period in growth-restricted fetuses affected by severe pre-eclampsia.

PMID: 15894830 [PubMed - indexed for MEDLINE]
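
The generalized (order-q) Hurst exponents used above come from the structure function S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q*H(q)). A minimal sketch of that estimator, checked on ordinary Brownian motion where H(q) is close to 0.5; this is a generic illustration, not the authors' pipeline, and for negative q zero increments would need special handling:

```python
import numpy as np

def generalized_hurst(x, q, lags):
    """Generalized Hurst exponent H(q) from the structure-function scaling
    S_q(tau) = mean(|x(t+tau) - x(t)|^q) ~ tau^(q*H(q))."""
    log_lag, log_s = [], []
    for tau in lags:
        diffs = np.abs(x[tau:] - x[:-tau])
        log_lag.append(np.log(tau))
        log_s.append(np.log(np.mean(diffs ** q)))
    slope, _ = np.polyfit(log_lag, log_s, 1)  # slope is q*H(q)
    return slope / q

# Sanity check on a random walk: H(q) should be close to 0.5.
rng = np.random.default_rng(2)
walk = np.cumsum(rng.standard_normal(20000))
h = generalized_hurst(walk, q=2, lags=[1, 2, 4, 8, 16])
print(round(h, 2))
```

In the study, H(q) is evaluated over very short windows (<= 15 beats) for orders q from -5 to 5, so that different q emphasize sites with large or small heart-rate variation.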

199. Appl Opt. 2005 Feb 1;44(4):527-33.

Superhydrophobic antireflective silica films: FRACTAL surfaces and laser-induced

damage thresholds.

Xu Y(1), Wu D, Sun YH, Huang ZX, Jiang XD, Wei XF, Li ZH, Dong BZ, Wu ZH.

Author information:

(1)State Key Laboratory of Coal Conversion, Institute of Coal Chemistry, Chinese

Academy of Sciences, Taiyuan 030001, China.

Several superhydrophobic antireflective silica films have been prepared by a

solgel method that uses hexamethyl-disilizane (HMDS) as a modifier. In a

high-power laser, laser-induced damage thresholds (LIDTs) of 23-30 J/cm2 were

obtained at 1064-nm wavelength with 1-ns pulse duration. By atomic-force

microscopy and optical microscopy, the FRACTAL surfaces of films were studied,

and multiFRACTAL spectra (MFSs) were calculated both before and after laser

damage. The two-sided effect of HMDS on particle growth determined the surface

FRACTAL of a particle and the multiFRACTAL structure of a film's surface. The

bigger deltaalpha was, both before and after laser damage, the lower the LIDT

was. The effect of methyl groups should be included in the determination of the

MFS of the LIDT.

PMID: 15726949 [PubMed]

200. J Theor Biol. 2005 Mar 21;233(2):191-8. Epub 2004 Nov 18.

Middle and long distance athletics races viewed from the perspective of

complexity.

García-Manso JM(1), Martín-González JM, Dávila N, Arriaza E.

Author information:

(1)Departamento de Educación Física, Facultad de Ciencias de la Actividad Física y

el Deporte, Universidad de Las Palmas de Gran Canaria, 35017 Canary Islands,

Spain.

Middle and long distance athletics races behave as power-laws when time (or

average speed) and distance are related, thus suggesting the presence of critical

phenomena. Power-laws as a function of the athlete's position in the all-time

world ranking allow us to define a Performance Index that reveals the existence

of possible multiFRACTAL structures associated with the natural barriers toward

which the athletes tend in their evolution towards better results and in pursuit of

world records. The new theories of self-organized critical phenomena provide an

explanation for the power-law and FRACTAL structures in systems at, or near,

their critical points. In this paper we analyse the athletic races using these

theories and as a result of this study a new variety of interpretations are


PMID: 15619360 [PubMed - indexed for MEDLINE]

201. Biofizika. 2004 Nov-Dec;49(6):1075-83.

[Calculation of local Hurst exponents in the Ca(2+)-activated K(+)-channel dwell

time series]

[Article in Russian]

Brazhe AR, Astashev ME, Maksimov GV, Kazachenko VN, Rubin AB.

A novel method based on the maximum overlap wavelet transform of dwell time

series is proposed. Information on local multiFRACTAL properties of the series,

namely local Hurst exponents or Hölder exponents, was obtained. The results

confirm the presence of multiFRACTALity and intrinsic correlations in the

Ca(2+)-activated K+ channel dwell time series. The data on the local multiFRACTAL

structure of the series can be interpreted in terms of processes having

self-organized criticality. The proposed approach allows one to widen the store

of methods for the analysis of single ion channel activity.

PMID: 15612549 [PubMed - indexed for MEDLINE]

202. J Theor Biol. 2005 Feb 21;232(4):559-67.

A FRACTAL method to distinguish coding and non-coding sequences in a complete

genome based on a number sequence representation.

Zhou LQ(1), Yu ZG, Deng JQ, Anh V, Long SC.

Author information:

(1)School of Mathematics and Computing Science, Xiangtan University, Hunan 411105,


A FRACTAL method to distinguish coding and non-coding sequences in a complete

genome is proposed, based on different statistical behaviors between these two

kinds of sequences. We first propose a number sequence representation of DNA

sequences. MultiFRACTAL analysis is then performed on the measure representation

of the obtained number sequence. The three exponents C(-1), C1 and C2 are

selected from the result of multiFRACTAL analysis. Each DNA may be represented by

a point in the three-dimensional space generated by these three-component

vectors. It is shown that points corresponding to coding and non-coding sequences

in the complete genome of many prokaryotes are roughly distributed in different

regions. Fisher's discriminant algorithm can be used to separate these two

regions in the spanned space. If the point (C(-1),C1,C2) for a DNA sequence is

situated in the region corresponding to coding sequences, the sequence is

discriminated as a coding sequence; otherwise, the sequence is classified as a

non-coding one. For all 51 prokaryotes we considered, the average discriminant

accuracies pc,pnc,qc and qnc reach 72.28%, 84.65%, 72.53% and 84.18%,


PMID: 15588636 [PubMed - indexed for MEDLINE]
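
The exponents C(-1), C1 and C2 in the entry above come from the generalized-dimension side of multifractal analysis. As an illustration of that machinery (a toy measure of my own, not the paper's DNA representation), here is D(q) evaluated on a binomial cascade, for which a closed-form answer exists:

```python
import math

def binomial_measure(m, levels):
    """Binomial cascade on [0,1]: each interval splits its mass m : (1 - m)."""
    mu = [1.0]
    for _ in range(levels):
        mu = [w * f for w in mu for f in (m, 1.0 - m)]
    return mu

def generalized_dimension(mu, q):
    """Box-sum estimate D(q) = log(sum mu_i^q) / ((q - 1) * log(eps)), q != 1."""
    eps = 1.0 / len(mu)
    z = sum(p ** q for p in mu if p > 0)
    return math.log(z) / ((q - 1) * math.log(eps))

mu = binomial_measure(0.7, 10)
d2 = generalized_dimension(mu, 2)        # correlation dimension D(2)
d_neg = generalized_dimension(mu, -1)    # negative q emphasizes sparse regions
exact_d2 = -math.log(0.7 ** 2 + 0.3 ** 2) / math.log(2)
```

Because the cascade is exactly self-similar, the single-level box sum already matches the closed form -log2(m^q + (1-m)^q)/(q-1), which makes it a convenient unit test; D(q) is nonincreasing in q for any multifractal measure.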

203. Ultramicroscopy. 2004 Dec;102(1):51-9.

Influence of the atomic force microscope tip on the multiFRACTAL analysis of

rough surfaces.

Klapetek P(1), Ohlídal I, Bílek J.

Author information:

(1)Czech Metrology Institute, Okruzní 31, 638 00 Brno, Czech Republic.

In this paper, the influence of atomic force microscope tip on the multiFRACTAL

analysis of rough surfaces is discussed. This analysis is based on two methods,

i.e. on the correlation function method and the wavelet transform modulus maxima

method. The principles of both methods are briefly described. Both methods are

applied to simulated rough surfaces (simulation is performed by the spectral

synthesis method). It is shown that the finite dimensions of the microscope tip

misrepresent the values of the quantities expressing the multiFRACTAL analysis of

rough surfaces within both the methods. Thus, it was concretely shown that the

influence of the finite dimensions of the microscope tip changed mono-FRACTAL

properties of simulated rough surface to multiFRACTAL ones. Further, it is shown

that a surface reconstruction method developed for removing the negative

influence of the microscope tip does not improve the results obtained in a

substantial way. The theoretical procedures concerning both the methods, i.e. the

correlation function method and the wavelet transform modulus maxima method, are

illustrated for the multiFRACTAL analysis of randomly rough gallium arsenide

surfaces prepared by means of the thermal oxidation of smooth gallium arsenide

surfaces and subsequent dissolution of the oxide films.

PMID: 15556700 [PubMed - indexed for MEDLINE]

204. Biofizika. 2004 Sep-Oct;49(5):852-65.

[FRACTAL properties of gating in potential-dependent K+-channels in Lymnaea

stagnalis neurons].

[Article in Russian]

Kazachenko VN, Kochetkov KV, Astashev ME, Grinevich AA.

Sets of the channel open times, [tau(o)], and closed times, [tau(c)], and the

full set of the channel open and closed times, [tau(o), tau(c)], in the activity

of single voltage-dependent K+-channels in mollusc L. stagnalis neurons were

analyzed using the rescaled range analysis (Hurst method), fast Fourier and

wavelet transforms. It was found that the Hurst dependence for each time series

could be approximated by a polygonal line with at least two slopes: H1 and H2

(Hurst exponents). The averaged values of H1 and H2 for the sets [tau(o), tau(c)]

were equal to 0.61 +/- 0.03 and 0.83 +/- 0.11, respectively; for the [tau(o)]

sets H1 = 0.66 +/- 0.03 and H2 = 0.95 +/- 0.10; for the [tau(c)] sets, H1 = 0.62

+/- 0.05 and H2 = 0.85 +/- 0.10. In some cases, a third slope appeared on the

Hurst dependences. It was very variable and ranged between 0.5 and 1. The Hurst

exponents H1, H2, and H3 characterized short, intermediate, and long time ranges,

respectively. The ranges greatly varied from experiment to experiment. The data

obtained show that the channel openings and closings (gating process) represent a

persistent process correlated in time. The randomization of the time sets

resulted in a single slope, H, of 0.52 +/- 0.02 characteristic of random

processes. The results were confirmed by the fast Fourier and wavelet transforms.

In addition, possible voltage dependences of Hurst exponents and their

correlation with tau(o) and tau(c) were investigated. As a whole, single channel

activity may be characterized as a multiFRACTAL process with a slight voltage

dependence of the Hurst exponents.

PMID: 15526471 [PubMed - indexed for MEDLINE]
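
The rescaled range (Hurst) method used in the entry above is short enough to sketch. The window sizes and the synthetic test signal are my illustrative choices; dwell-time series from channel recordings would take the place of the noise:

```python
import math
import random

def rescaled_range(series):
    """R/S of one window: range of cumulative deviations over the std deviation."""
    n = len(series)
    mean = sum(series) / n
    dev, cum = 0.0, []
    for x in series:
        dev += x - mean
        cum.append(dev)
    r = max(cum) - min(cum)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return r / s if s > 0 else 0.0

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Slope of log(R/S) vs log(n), averaging R/S over non-overlapping windows."""
    pts = []
    for n in window_sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        pts.append((math.log(n), math.log(sum(rs) / len(rs))))
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, y in pts))

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(2048)]
h = hurst_exponent(noise)   # uncorrelated noise: H near 0.5
```

For short windows the classical R/S estimate is biased slightly above 0.5 even for uncorrelated data, which is why the entry's persistent values (H well above 0.6) are the meaningful signal.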

205. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Sep;70(3 Pt 2):035104. Epub 2004

Sep 21.

Where two FRACTALs meet: the scaling of a self-avoiding walk on a percolation


von Ferber C(1), Blavats'ka V, Folk R, Holovatch Y.

Author information:

(1)Theoretische Polymerphysik, Universität Freiburg, D-79104 Freiburg, Germany.

The scaling properties of self-avoiding walks on a d-dimensional diluted lattice

at the percolation threshold are analyzed by a field-theoretical renormalization

group approach. To this end we reconsider the model of [Phys. Rev. Lett. 63, 2819

(1989)] and argue that via renormalization its multiFRACTAL properties are

directly accessible. While the former first order perturbation did not agree with

the results of other methods, our analytic result gives an accurate description of

the available MC and exact enumeration data in a wide range of dimensions

2 ≤ d ≤ 6.

PMID: 15524568 [PubMed]

206. J Neurol Sci. 2004 Oct 15;225(1-2):33-7.

Quantitative evaluation of age-related white matter microstructural changes on

MRI by multiFRACTAL analysis.

Takahashi T(1), Murata T, Omori M, Kosaka H, Takahashi K, Yonekura Y, Wada Y.

Author information:

(1)Department of Neuropsychiatry, Faculty of Medical Sciences, University of Fukui,

Fukui 910-1193, Japan.

MultiFRACTAL analysis has been applied to evaluate biological tissues, which are

composed of complex structures. We carried out multiFRACTAL analyses in a group

of healthy young and elderly subjects to examine age-related white matter

microstructural changes on T2-weighted MR images without any visible abnormal

intensity, and to correlate such changes with age-related cognitive decline.

Comparison between the two age groups showed that Δα (established as the

most suitable index of heterogeneity in our previous report) in the frontal

region was significantly higher in the elderly group, but no significant group

difference was found in Δα in the parieto-occipital region. The

Trail-Making Test score (a measure of executive dysfunction) was significantly

higher in the elderly group. In the elderly group, the Trail-Making Test score

was positively correlated with Δα in the frontal region, but not in the

parieto-occipital region. These results suggest that microstructural changes in

the white matter preferentially occur in the frontal region with normal aging,

and these changes are associated with executive cognitive decline reflective of

frontal-subcortical dysfunction.

PMID: 15465083 [PubMed - indexed for MEDLINE]

207. Phys Rev E Stat Nonlin Soft Matter Phys. 2004;70(1 Pt 2):016306. Epub 2004 Jul


Anomalous diffusion exponents in continuous two-dimensional multiFRACTAL media.

de Dreuzy JR(1), Davy P, Erhel J, de Brémond d'Ars J.

Author information:

(1)Géosciences Rennes, UMR CNRS 6118, Université de Rennes, Campus de Beaulieu,

35042 Rennes Cedex, France.

We study diffusion in heterogeneous multiFRACTAL continuous media that are

characterized by the second-order dimension of the multiFRACTAL spectrum D2,

while the FRACTAL dimension of order 0, D0, is equal to the embedding Euclidean

dimension 2. We find that the mean anomalous and fracton dimensions, d(w) and

d(s), are equal to those of homogeneous media showing that, on average, the key

parameter is the FRACTAL dimension of order 0 D0, equal to the Euclidean

dimension and not to the correlation dimension D2. Beyond their average, the

anomalous diffusion and fracton exponents, d(w) and d(s), are highly variable and

consistently range in the interval [1,4]. d(w) can be consistently either larger

or lower than 2, indicating possible subdiffusive and superdiffusive regimes. On

a realization basis, we show that the exponent variability is related to the

local conductivity at the medium inlet through the conductivity scaling.

PMID: 15324168 [PubMed]

208. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 May;69(5 Pt 1):051919. Epub 2004

May 28.

Hierarchical structure in healthy and diseased human heart rate variability.

Ching ES(1), Lin DC, Zhang C.

Author information:

(1)Department of Physics, The Chinese University of Hong Kong, Shatin, Hong Kong.

It is shown that the healthy and diseased human heart rate variability (HRV)

possesses a hierarchical structure of the She-Leveque (SL) form. This structure,

first found in measurements in turbulent fluid flows, implies further details in

the HRV multiFRACTAL scaling. The potential of diagnosis is also discussed based

on the characteristics derived from the SL hierarchy.

PMID: 15244859 [PubMed - indexed for MEDLINE]

209. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Jun;69(6 Pt 2):066135. Epub 2004

Jun 23.

Percolation on a multiFRACTAL.

Corso G(1), Freitas JE, Lucena LS, Soares RF.

Author information:

(1)International Center for Complex Systems and Departamento de Física Teórica e

Experimental, Universidade Federal do Rio Grande do Norte, Campus Universitário

59078 970, Natal, RN, Brazil.

We investigate percolation phenomena in multiFRACTAL objects that are built in a

simple way. In these objects the multiFRACTALity comes directly from the

geometric tiling. We identify some differences between percolation in the

proposed multiFRACTALs and in a regular lattice. There are basically two sources

of these differences. The first is related to the coordination number, which

changes along the multiFRACTAL. The second comes from the way the weight of each

cell in the multiFRACTAL affects the percolation cluster. We use many samples of

finite size lattices and draw the histogram of percolating lattices against site

occupation probability. Depending on a parameter characterizing the multiFRACTAL

and the lattice size, the histogram can have two peaks. We observe that the

percolation threshold for the multiFRACTAL is lower than that for the square

lattice. We compute the FRACTAL dimension of the percolating cluster and the

critical exponent beta. Despite the topological differences, we find that the

percolation in a multiFRACTAL support is in the same universality class as

standard percolation.

PMID: 15244695 [PubMed]
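
For comparison with the multifractal lattices studied in the entry above, the square-lattice baseline is easy to simulate: occupy sites with probability p and test for a top-to-bottom spanning cluster. A minimal Monte Carlo sketch (my illustration only; the paper's multifractal tiling is not reproduced here):

```python
import random

def percolates(grid):
    """True if occupied sites connect the top row to the bottom row (4-neighbour)."""
    n = len(grid)
    stack = [(0, j) for j in range(n) if grid[0][j]]
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                stack.append((ni, nj))
    return False

def spanning_probability(n, p, trials, rng):
    """Fraction of random n x n site configurations that span top to bottom."""
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

rng = random.Random(1)
p_high = spanning_probability(24, 0.75, 40, rng)   # well above threshold
p_low = spanning_probability(24, 0.45, 40, rng)    # well below threshold
```

On large square lattices the spanning probability jumps from 0 to 1 near the site-percolation threshold p_c ≈ 0.5927; the entry above reports a lower threshold for the multifractal support.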

210. Neuroimage. 2004 Jul;22(3):1195-202.

Wavelet-based multiFRACTAL analysis of fMRI time series.

Shimizu Y(1), Barth M, Windischberger C, Moser E, Thurner S.

Author information:

(1)MR Centre of Excellence, Medical University of Vienna, Austria.

Functional magnetic resonance imaging (fMRI) time series are investigated with a

multiFRACTAL method based on the Wavelet Modulus Maxima (WTMM) method to extract

local singularity ("FRACTAL") exponents. The spectrum of singularity exponents of

each fMRI time series is quantified by spectral characteristics including its

maximum and the corresponding dimension. We found that the range of Hölder

exponents in voxels with activation is close to 1, whereas exponents are close to

0.5 in white matter voxels without activation. The maximum dimension decreases

going from white matter to gray matter, and is lower still for activated time

series. The full-width-at-half-maximum of the spectra is higher in activated

areas. The proposed method becomes particularly effective when combining these

spectral characteristics into a single parameter. Using these multiFRACTAL

parameters, it is possible to identify activated areas in the human brain in both

hybrid and in vivo fMRI data sets without knowledge of the stimulation paradigm


Copyright 2004 Elsevier Inc.

PMID: 15219591 [PubMed - indexed for MEDLINE]

211. Neurol Neurochir Pol. 2003 Nov-Dec;37(6):1199-209.

[FRACTAL analysis of MCA blood flow velocity fluctuations in

migraine--preliminary report].

[Article in Polish]

Glaubic-Latka M(1), Latka M, Latka D, Bury W, Pierzchała K.

Author information:

(1)Oddziału Neurologii B Wojewódzkiego Zespołu Neuropsychiatrycznego w Opolu.

Many reports confirm the existence of long-range correlations between

fluctuations of various physiological signals in healthy subjects and demonstrate

disappearance of these correlations in pathological conditions. Blood flow

velocity in intracranial vessels is changeable over time and depends on complex

physiological regulatory mechanisms. The character of blood flow velocity

fluctuations may indicate the presence of vascular disorders associated with

various diseases. The aim of our study was to establish whether fluctuations in

MCA blood flow velocity are FRACTAL in physiological conditions and if so,

whether this feature is lost in migraine, as the role of vasomotoric disturbances

has been already evidenced in pathophysiology of this disease. The axial flow

velocity changes averaged over a cardiac beat interval were monitored

continuously via two channels through the temporal windows using a DWL Multi-DopT

TCD device with 2-MHz probes. The examinations were performed in supine rest in

two-hour periods in two groups: of 7 patients with clinically confirmed migraine

with aura during headache-free intervals (15 recordings), and in the control

group of 4 young, healthy volunteers (10 recordings). The results in the form of

time series were analysed using the methods of FRACTAL statistics.

MultiFRACTALity in the recordings in physiological conditions was clearly

confirmed, as well as its absence in the averaged recordings in the group of

migraneurs. The findings justify a supposition that the breakdown of multiFRACTAL

properties of MCA blood flow time series in migraine may result from the

vasomotor disturbances present even during headache-free intervals. However,

possible usefulness of this method in the diagnostics of migraine requires

further investigation.

PMID: 15174233 [PubMed - indexed for MEDLINE]

212. Comput Med Imaging Graph. 2004 Jun;28(4):203-11.

A digital reference model of the human bronchial tree.

Schmidt A(1), Zidowitz S, Kriete A, Denhard T, Krass S, Peitgen HO.

Author information:

(1)Image Processing Laboratory, Institute of Anatomy and Cell Biology,

Justus-Liebig-University, Aulweg 123, 35385 Giessen, Germany.

In-vitro preparations of the human lung combined with high-resolution tomography

can be used to derive precise models of the human lung. To develop an abstract

graph representation, specially adapted image processing algorithms were applied

to segment and delineate the bronchi. The graph thus obtained contains

topological information about spatial coordinates, connectivities, diameters and

branching angles of 1453 bronchi up to the 17th Horsfield order. The graph was

analyzed for statistical and FRACTAL properties and was compared with current

models. Results indicate a model that exhibits asymmetry and multiFRACTAL

properties. This newly established reference model is an important step forward

in geometrical accuracy of the bronchial tree representation that will improve

both analysis of lung images in clinical imaging and the realism of functional


PMID: 15121209 [PubMed - indexed for MEDLINE]

213. Eur Biophys J. 2004 Oct;33(6):535-42. Epub 2004 Mar 16.

Patterning of endocytic vesicles and its control by voltage-gated Na+ channel

activity in rat prostate cancer cells: FRACTAL analyses.

Krasowska M(1), Grzywna ZJ, Mycielska ME, Djamgoz MB.

Author information:

(1)Neuroscience Solutions to Cancer Research Group, Department of Biological

Sciences, Imperial College London, Sir Alexander Fleming Building, South

Kensington Campus, London, SW7 2AZ, UK.

FRACTAL methods were used to analyze quantitative differences in secretory

membrane activities of two rat prostate cancer cell lines (Mat-LyLu and AT-2) of

strong and weak metastatic potential, respectively. Each cell's endocytic

activity was determined by horseradish peroxidase uptake. Digital images of the

patterns of vesicular staining were evaluated by multiFRACTAL analyses:

generalized FRACTAL dimension (Dq) and its Legendre transform f(alpha), as well

as partitioned iterated function system -- semiFRACTAL (PIFS-SF) analysis. These

approaches revealed consistently that, under control conditions, all multiFRACTAL

parameters and PIFS-SF codes determined had values greater for Mat-LyLu compared

with AT-2 cells. This would agree generally with the endocytic/vesicular activity

of the strongly metastatic Mat-LyLu cells being more developed than the

corresponding weakly metastatic AT-2 cells. All the parameters studied were

sensitive to tetrodotoxin (TTX) pre-treatment of the cells, which blocked

voltage-gated Na+ channels (VGSCs). Some of the parameters had a "simple"

dependence on VGSC activity, whereby pre-treatment with TTX reduced the values

for the MAT-LyLu cells and eliminated the differences between the two cell lines.

For other parameters, however, there was a "complex" dependence on VGSC activity.

The possible physical/physiological meaning of the mathematical parameters

studied and the nature of involvement of VGSC activity in control of

endocytosis/secretion are discussed.

PMID: 15024523 [PubMed - indexed for MEDLINE]

214. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Jan;69(1 Pt 2):016309. Epub 2004

Jan 29.

Dual multiFRACTAL spectra.

Roux S(1), Jensen MH.

Author information:

(1)Laboratoire Surface du Verre et Interfaces, UMR CNRS/Saint-Gobain, 39 quai Lucien

lefranc, 93303 Aubervilliers cedex, France.

The multiFRACTAL formalism characterizes the scaling properties of a physical

density rho as a function of the distance L. To each singularity alpha of the

field is attributed a FRACTAL dimension for its support f(alpha). An alternative

representation has been proposed by considering the distribution of distances

associated to a fixed mass. Computing these spectra for a multiFRACTAL Cantor

set, it is shown that these two approaches are dual to each other, and that both

spectra as well as the moment scaling exponents are simply related. We apply the

same inversion formalism to exponents obtained for turbulent statistics in the

Gledzer-Ohkitani-Yamada shell model and observe that the same duality relation

holds here.

PMID: 14995714 [PubMed]

215. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Jan;69(1 Pt 1):011403. Epub 2004

Jan 27.

Diffusion-limited aggregation with power-law pinning.

Hentschel HG(1), Popescu MN, Family F.

Author information:

(1)Department of Physics, Emory University, Atlanta, Georgia 30322, USA.

Using stochastic conformal mapping techniques we study the patterns emerging from

Laplacian growth with a power-law decaying threshold for growth R(-gamma)(N)

(where R(N) is the radius of the N-particle cluster). For gamma>1 the growth

pattern is in the same universality class as diffusion limited aggregation (DLA),

while for gamma<1 the resulting patterns have a lower FRACTAL dimension D(gamma)

than a DLA cluster due to the enhancement of growth at the hot tips of the

developing pattern. Our results indicate that a pinning transition occurs at

gamma=1/2, significantly smaller than might be expected from the lower bound

alpha(min) approximately 0.67 of the multiFRACTAL spectrum of DLA. This limiting case

shows that the most singular tips in the pruned cluster now correspond to those

expected for a purely one-dimensional line. Using multiFRACTAL analysis, analytic

expressions are established for D(gamma) both close to the breakdown of DLA

universality class, i.e., gamma ≲ 1, and close to the pinning

transition, i.e., gamma ≳ 1/2.

PMID: 14995617 [PubMed]

216. J Biol Phys. 2004 Mar;30(1):33-81. doi: 10.1023/B:JOBP.0000016438.86794.8e.

Wavelet Analysis of DNA Bending Profiles reveals Structural Constraints on the

Evolution of Genomic Sequences.

Audit B(1), Vaillant C, Arnéodo A, d'Aubenton-Carafa Y, Thermes C.

Author information:

(1)Centre de Recherche Paul Pascal, avenue Schweitzer, 33600 Pessac, France.

Analyses of genomic DNA sequences have shown in previous works that base pairs

are correlated at large distances with scale-invariant statistical properties. We

show in the present study that these correlations between nucleotides (letters)

result in fact from long-range correlations (LRC) between sequence-dependent DNA

structural elements (words) involved in the packaging of DNA in chromatin. Using

the wavelet transform technique, we perform a comparative analysis of the DNA

text and of the corresponding bending profiles generated with curvature tables

based on nucleosome positioning data. This exploration through the optics of the

so-called `wavelet transform microscope' reveals a characteristic scale of

100-200 bp that separates two regimes of different LRC. We focus here on the

existence of LRC in the small-scale regime (≲ 200 bp). Analysis of genomes in the

three kingdoms reveals that this regime is specifically associated to the

presence of nucleosomes. Indeed, small scale LRC are observed in eukaryotic

genomes and to a less extent in archaeal genomes, in contrast with their absence

in eubacterial genomes. Similarly, this regime is observed in eukaryotic but not

in bacterial viral DNA genomes. There is one exception for genomes of Poxviruses,

the only animal DNA viruses that do not replicate in the cell nucleus and do not

present small scale LRC. Furthermore, no small scale LRC are detected in the

genomes of all examined RNA viruses, with one exception in the case of

retroviruses. Altogether, these results strongly suggest that small-scale LRC are

a signature of the nucleosomal structure. Finally, we discuss possible

interpretations of these small-scale LRC in terms of the mechanisms that govern

the positioning, the stability and the dynamics of the nucleosomes along the DNA

chain. This paper is mainly devoted to a pedagogical presentation of the

theoretical concepts and physical methods which are well suited to perform a

statistical analysis of genomic sequences. We review the results obtained with

the so-called wavelet-based multiFRACTAL analysis when investigating the DNA

sequences of various organisms in the three kingdoms. Some of these results have

been announced in B. Audit et al. [1, 2].

PMCID: PMC3456503

PMID: 23345861 [PubMed]

217. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Dec;68(6 Pt 1):061509. Epub 2003

Dec 24.

FRACTAL patterns, cluster dynamics, and elastic properties of magnetorheological


Carrillo JL(1), Donado F, Mendoza ME.

Author information:

(1)Instituto de Física de la Universidad Autónoma de Puebla, Apartado Postal J-48,

Puebla 72570, Puebla, México.

We study pattern formation and the aggregation processes in magnetorheological

suspensions in the presence of a static magnetic field, and some of their

associated physical properties. In particular, we analyze the elastic modes as a

function of the intensity of the applied field and for several particle

concentrations. We observe that the clusters formed in these systems have

multiFRACTAL characteristics, which are the result of three well defined stages

of the aggregation process. In these stages three generations of clusters are

produced sequentially. The structure of the suspension can be well characterized

by its mass FRACTAL dimensions and the mass radial distribution. The size

distribution of the second-generation clusters written in terms of their mass

FRACTAL dimension allows us to calculate the sound speed of the longitudinal

modes in the large wavelength regime. This multiFRACTAL analysis applied to

several kinds of aggregates reveals that the occurrence of at least three stages

of aggregation is a common feature to several physical aggregation processes.

PMID: 14754214 [PubMed]

218. J Theor Biol. 2004 Feb 7;226(3):341-8.

Chaos game representation of protein sequences based on the detailed HP model and

their multiFRACTAL and correlation analyses.

Yu ZG(1), Anh V, Lau KS.

Author information:

(1)Program in Statistics and Operations Research, Queensland University of

Technology, G.P.O. Box 2434, QLD 4001, Brisbane, Australia

Similar to the chaos game representation (CGR) of DNA sequences proposed by

Jeffrey (Nucleic Acid Res. 18 (1990) 2163), a new CGR of protein sequences based

on the detailed HP model is proposed. MultiFRACTAL and correlation analyses of

the measures based on the CGR of protein sequences from complete genomes are

performed. The Dq spectra of all organisms studied are multiFRACTAL-like and

sufficiently smooth for the Cq curves to be meaningful. The Cq curves of bacteria

resemble a classical phase transition at a critical point. The correlation

distance of the difference between the measure based on the CGR of protein

sequences and its FRACTAL background is also proposed to construct a more precise

phylogenetic tree of bacteria.

PMID: 14643648 [PubMed - indexed for MEDLINE]
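
Jeffrey's chaos game representation, which the entry above adapts to proteins via the detailed HP model, takes only a few lines for DNA: start at the centre of a unit square whose corners are labelled A, C, G and T, and step halfway toward the corner of each successive base. A sketch of the nucleotide version (the protein variant is not reproduced here):

```python
def cgr_points(sequence):
    """Jeffrey's chaos game representation of a DNA string on the unit square."""
    corners = {'A': (0.0, 0.0), 'C': (0.0, 1.0),
               'G': (1.0, 1.0), 'T': (1.0, 0.0)}
    x, y = 0.5, 0.5                      # start at the centre
    pts = []
    for base in sequence:
        cx, cy = corners[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0   # move halfway to the corner
        pts.append((x, y))
    return pts

pts = cgr_points("ACGTACGTAC")
```

Binning the resulting points on a 2^k × 2^k grid counts k-mer frequencies, and that induced measure is what the multifractal D(q) analysis is then run on.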

219. Int J Neurosci. 2003 Nov;113(11):1615-39.

Regular developmental changes in EEG multiFRACTAL characteristics.

Polonnikov RI(1), Wasserman EL, Kartashev NK.

Author information:

(1)Saint Petersburg Institute for Informatics and Automation of Russian Academy of

Sciences, Saint Petersburg, Russia.

Electroencephalograms (EEGs) of 110 pupils (aged 6.5-19.5 years; 48 healthy

subjects, 51 with cerebral palsy, 11 with acquired cerebral defect) were

acquired. The stable age dependences of averaged parameters of the k·f^(-beta) EEG

spectra model, deviations from the model, and normalized ranges of detrended EEGs

were found. These dependences were observed in both healthy subjects and

patients, in males and females, in the cases of congenital and acquired

pathology. This regularity reflects the process of cerebral maturation and

developmental change of structure types of the EEG process, and it is inherent to

normal as well as to abnormal brain development.

PMID: 14585757 [PubMed - indexed for MEDLINE]

220. Med Biol Eng Comput. 2003 Sep;41(5):543-9.

Multi- and monoFRACTAL indices of short-term heart rate variability.

Fischer R(1), Akay M, Castiglioni P, Di Rienzo M.

Author information:

(1)Department of Biomedical Engineering, Rutgers University, Piscataway, USA.

Indices of heart rate variability (HRV) based on FRACTAL signal models have

recently been shown to possess value as predictors of mortality in specific

patient populations. To develop more powerful clinical indices of HRV based on a

FRACTAL signal model, the study investigated two HRV indices based on a

monoFRACTAL signal model called fractional Brownian motion and an index based on

a multiFRACTAL signal model called multifractional Brownian motion. The

performance of the indices was compared with an HRV index in common clinical use.

To compare the indices, 18 normal subjects were subjected to postural changes,

and the indices were compared on their ability to respond to the resulting

autonomic events in HRV recordings. The magnitude of the response to postural

change (normalised by the measurement variability) was assessed by analysis of

variance and multiple comparison testing. Four HRV indices were investigated for

this study: the standard deviation of all normal R-R intervals; an HRV index

commonly used in the clinic; detrended fluctuation analysis, an HRV index found

to be the most powerful predictor of mortality in a study of patients with

depressed left ventricular function; an HRV index developed using the maximum

likelihood estimation (MLE) technique for a monoFRACTAL signal model; and an HRV

index developed for the analysis of multifractional Brownian motion signals. The

HRV index based on the MLE technique was found to respond most strongly to the

induced postural changes (95% CI). The magnitude of its response (normalised by

the measurement variability) was at least 25% greater than any of the other

indices tested.

PMID: 14572004 [PubMed - indexed for MEDLINE]
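
Detrended fluctuation analysis, one of the indices compared in the entry above, integrates the mean-centred series, removes a least-squares line from each window, and reads the exponent alpha off the log-log slope of fluctuation versus window size. A compact sketch (window sizes and test signals are my own choices, not the study's protocol):

```python
import math
import random

def dfa(series, window_sizes=(8, 16, 32, 64)):
    """First-order detrended fluctuation analysis; returns the exponent alpha."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:                 # integrate the mean-centred series
        acc += x - mean
        profile.append(acc)
    pts = []
    for n in window_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            t = list(range(n))
            mt, ms = sum(t) / n, sum(seg) / n
            slope = (sum((u - mt) * (v - ms) for u, v in zip(t, seg))
                     / sum((u - mt) ** 2 for u in t))
            intercept = ms - slope * mt
            sq += sum((v - (intercept + slope * u)) ** 2 for u, v in zip(t, seg))
            count += n
        pts.append((math.log(n), math.log(math.sqrt(sq / count))))
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, y in pts))

random.seed(2)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha_noise = dfa(noise)             # white noise: alpha near 0.5
walk, acc = [], 0.0
for x in noise:                      # running sum of the noise
    acc += x
    walk.append(acc)
alpha_walk = dfa(walk)               # random walk: alpha near 1.5
```

White noise gives alpha near 0.5 and its running sum near 1.5; in HRV work, alpha near 1 is the commonly cited healthy regime, which is part of why such indices carry prognostic value.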

221. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Aug;68(2 Pt 1):021913. Epub 2003

Aug 22.

MultiFRACTAL and correlation analyses of protein sequences from complete genomes.

Yu ZG(1), Anh V, Lau KS.

Author information:

(1)Program in Statistics and Operations Research, Queensland University of

Technology, GPO Box 2434, Brisbane Q4001, Australia.

A measure representation of protein sequences similar to the measure

representation of DNA sequences proposed in our previous paper [Yu et al., Phys.

Rev. E 64, 031903 (2001)] and another induced measure are introduced.

MultiFRACTAL analysis is then performed on these two kinds of measures of a large

number of protein sequences derived from corresponding complete genomes. From the

values of the D(q) (generalized dimensions) spectra and related C(q) (analogous

specific heat) curves, it is concluded that these protein sequences are not

completely random sequences. For substrings with length K=5, the D(q) spectra of

all organisms studied are multiFRACTAL-like and sufficiently smooth for the C(q)

curves to be meaningful. The C(q) curves of all bacteria resemble a classical

phase transition at a critical point. But the "analogous" phase transitions of

higher organisms studied exhibit the shape of a double-peaked specific heat

function. But for the classification problem, the multiFRACTAL property is not

sufficient. When the measure representations of protein sequences from complete

genomes are considered as time series, a method based on correlation analysis

after removing some memory from the time series is proposed to construct a

phylogenetic tree. This construction is shown to be reasonably satisfactory.

PMID: 14525012 [PubMed - indexed for MEDLINE]

222. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Sep;68(3 Pt 2):036129. Epub 2003

Sep 24.

Logarithmic corrections to scaling in critical percolation and random resistor


Stenull O(1), Janssen HK.

Author information:

(1)Department of Physics and Astronomy, University of Pennsylvania, Philadelphia,

Pennsylvania 19104, USA.

We study the critical behavior of various geometrical and transport properties of

percolation in six dimensions. By employing field theory and renormalization

group methods we analyze fluctuation induced logarithmic corrections to scaling

up to and including the next-to-leading order correction. Our study comprehends

the percolation correlation function, i.e., the probability that two given points

are connected, and some of the FRACTAL masses describing percolation clusters. To

be specific, we calculate the mass of the backbone, the red bonds, and the

shortest path. Moreover, we study key transport properties of percolation as

represented by the random resistor network. We investigate the average two-point

resistance as well as the entire family of multiFRACTAL moments of the current


PMID: 14524854 [PubMed]

223. Eur J Appl Physiol. 2003 Oct;90(3-4):305-16. Epub 2003 Aug 27.

Self-affine FRACTAL variability of human heartbeat interval dynamics in health and disease.

Meyer M(1), Stiedl O.

Author information:
(1)Max Planck Institute for Experimental Medicine, Hermann-Rein Str 3, 37075 Göttingen, Germany.

The complexity of the cardiac rhythm is demonstrated to exhibit self-affine multiFRACTAL variability. The dynamics of heartbeat interval time series was analyzed by application of the multiFRACTAL formalism based on the Cramèr theory of large deviations. The continuous multiFRACTAL large deviation spectrum uncovers the nonlinear FRACTAL properties in the dynamics of heart rate and presents a useful diagnostic framework for discrimination and classification of patients with cardiac disease, e.g., congestive heart failure. The characteristic multiFRACTAL pattern in heart transplant recipients or chronic heart disease highlights the importance of neuroautonomic control mechanisms regulating the FRACTAL dynamics of the cardiac rhythm.

PMID: 12942331 [PubMed - indexed for MEDLINE]

224. Health Phys. 2003 Sep;85(3):330-42.

Application of FRACTAL and morphological methods in radioecology.

Makarenko N(1), Karimova L, Novak MM.

Author information:
(1)Institute of Mathematics, 480100 Almaty, Kazakhstan.

Effective management of radioactive contamination requires comprehensive knowledge of pollutants' characteristics. The complicated character of the problem is due to a number of issues, such as the very wide range of contamination, the presence of a mixture of radioactive isotopes, the highly variable diffusion of radionuclides in soil, water, and air, and the effect of climatic conditions. The resultant field has an irregular mosaic structure, which restricts the choice of measurement methods and data processing. In view of this, application of classical statistics techniques is often inappropriate in modeling such an environment. Application of the tools of FRACTAL and stochastic geometry provides a good insight and helps to distinguish between distribution characteristics of natural and man-made isotopes. Several techniques are implemented to determine scaling aspects of contaminated fields. The discovery of multiFRACTAL scaling leads to the hierarchical structure of contamination spots on different scales and intensity and places restrictions on the measurement net for detecting anomalies. The method of stochastic geometry further demonstrates that topological characteristics of contamination fields differ from those of the Gaussian fields and the topology of man-made isotopes differs from natural ones.

PMID: 12938723 [PubMed - indexed for MEDLINE]

225. Acta Med Okayama. 2003 Apr;57(2):49-52.

Variations of multiFRACTAL structure in the fetal heartbeats.

Miyagi Y(1), Miyagi Y, Terada S, Kudo T.

Author information:
(1)Department of Obstetrics and Gynecology, Okayama University Graduate School of Medicine and Dentistry, Okayama 700-8558, Japan.

Several procedures for evaluating fetal well-being are in clinical use. The cardiotocograph is mostly used as a non-invasive procedure to measure fetal well-being in clinical settings. The cardiotocograph displays the fetal heartbeat counts that vibrate. This variation has been classified into 2 categories. We investigated this variation by a novel method, in which we analyzed the change of structure of the attractors in the phase spaces according to the time course. We adopted the global spectrum, which means the distribution of FRACTAL dimensions, for that structure. In this procedure, we discovered a new variation in which the cycle is much longer than the 2 types of known variabilities. Although loud noises such as white noises with a magnitude 1/4 times as large as the standard deviation of the original data were added to the original data, the variations were still detected. The variation is very difficult to detect by Fourier or wavelet transformation, however, because it changes very slowly. Through this new way of analyzing the vibration phenomena, we obtained a new perspective on the biological information available.

PMID: 12866743 [PubMed - indexed for MEDLINE]

226. Dokl Biol Sci. 2003 Mar-Apr;389:143-6.

MultiFRACTAL analysis of the species structure of biotic communities.

Iudin DI(1), Gelashvili DB, Rozenberg GS.

Author information:

(1)Lobachevsky Nizhni Novgorod State University, Nizhni Novgorod, Russia.

PMID: 12854413 [PubMed - indexed for MEDLINE]

227. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Apr;67(4 Pt 1):042402. Epub 2003 Apr 17.

Scaling exponent of the maximum growth probability in diffusion-limited aggregation.

Jensen MH(1), Mathiesen J, Procaccia I.

Author information:
(1)The Niels Bohr Institute, Blegdamsvej 17, Copenhagen, Denmark.

An early (and influential) scaling relation in the multiFRACTAL theory of diffusion limited aggregation (DLA) is the Turkevich-Scher conjecture that relates the exponent alpha(min) that characterizes the "hottest" region of the harmonic measure and the FRACTAL dimension D of the cluster, i.e., D=1+alpha(min). Due to lack of accurate direct measurements of both D and alpha(min), this conjecture could never be put to a serious test. Using the method of iterated conformal maps, D was recently determined as D=1.713+/-0.003. In this paper, we determine alpha(min) accurately with the result alpha(min)=0.665+/-0.004. We thus conclude that the Turkevich-Scher conjecture is incorrect for DLA.

PMID: 12786408 [PubMed]
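Taking the abstract's own numbers, the failure of the Turkevich-Scher conjecture is simple arithmetic: D = 1 + alpha(min) would require alpha(min) near 0.713, while the measured value is 0.665 +/- 0.004. A small check, using a worst-case sum of the two quoted error bars:

```python
# Quoted values from the abstract: D = 1.713 +/- 0.003 (iterated conformal
# maps) and alpha(min) = 0.665 +/- 0.004.
D, dD = 1.713, 0.003
a_min, da = 0.665, 0.004

# Turkevich-Scher conjecture: D = 1 + alpha(min)
predicted = 1.0 + a_min
gap = D - predicted          # discrepancy of about 0.048
uncertainty = dD + da        # worst-case combined error bar, about 0.007

# The gap exceeds the combined error bars by nearly a factor of seven,
# which is why the authors reject the conjecture for DLA.
conjecture_holds = abs(gap) <= uncertainty
```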

228. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 May;67(5 Pt 1):051917. Epub 2003 May 20.

Nonlinear dynamical model of human gait.

West BJ(1), Scafetta N.

Author information:
(1)Pratt School of EE Department, Duke University, and Mathematics Division, Army Research Office, Research Triangle Park, North Carolina, USA.

We present a nonlinear dynamical model of the human gait control system in a variety of gait regimes. The stride-interval time series in normal human gait is characterized by slightly multiFRACTAL fluctuations. The FRACTAL nature of the fluctuations becomes more pronounced under both an increase and decrease in the average gait. Moreover, the long-range memory in these fluctuations is lost when the gait is keyed on a metronome. Human locomotion is controlled by a network of neurons capable of producing a correlated syncopated output. The central nervous system is coupled to the motocontrol system, and together they control the locomotion of the gait cycle itself. The metronomic gait is simulated by a forced nonlinear oscillator with a periodic external force associated with the conscious act of walking in a particular way.

PMID: 12786188 [PubMed - indexed for MEDLINE]
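The abstract's closing sentence models metronomic gait as a forced nonlinear oscillator with a periodic external force. As an illustration only, here is a Van der Pol oscillator driven by a sinusoidal force, integrated with explicit Euler steps; the choice of Van der Pol nonlinearity and all parameter values are illustrative assumptions, not the paper's specific model.

```python
from math import sin, pi

def forced_oscillator(mu=1.0, amp=0.5, omega=2 * pi, dt=0.001, steps=60000):
    # Van der Pol oscillator with periodic forcing:
    #   x'' - mu*(1 - x^2)*x' + x = amp*sin(omega*t)
    # integrated with simple explicit Euler steps; returns the x-trajectory.
    x, v, t = 0.1, 0.0, 0.0
    xs = []
    for _ in range(steps):
        a = mu * (1 - x * x) * v - x + amp * sin(omega * t)
        x += v * dt
        v += a * dt
        t += dt
        xs.append(x)
    return xs

# The nonlinear damping pulls the motion onto a bounded limit cycle
# entrained by the periodic drive, the analogue of a metronome-keyed gait.
traj = forced_oscillator()
```

The point of the model is qualitative: once the drive "keys" the oscillator, the output becomes periodic and the long-range memory seen in free gait is lost.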

229. Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Mar;67(3 Pt 2):036702. Epub 2003 Mar 24.

Models for correlated multiFRACTAL hypersurfaces.

Tavares DM(1), Lucena LS.

Author information:
(1)International Center for Complex Systems and Departamento de Física Teórica e Experimental-UFRN, Natal-RN 59078-970, Brazil.

We discuss and implement computer approximations of FRACTAL and multiFRACTAL hypersurfaces. These hypersurfaces consist of reconstructions of a stochastic process in the real space from randomly distributed variables in the discrete wavelet domain. The synthetic surfaces have the usual fractional Brownian motion as a particular case, and inherit the correlation structure of these FRACTALs. We first introduce the one-dimensional version of these surfaces that obey a weak self-affine symmetry. This symmetry appears in the wavelet domain as a condition on the second moments of the probability distributions of the wavelet coefficients. Then we use these relations to define the FRACTALs and multiFRACTALs in d dimensions. Finally, we concentrate on the generation of samples of these hypersurfaces.

PMID: 12689197 [PubMed]

230. Phys Rev E Stat Nonlin Soft Matter Phys. 2002 Dec;66(6 Pt 1):061906. Epub 2002 Dec 18.

MultiFRACTAL analysis of DNA walks and trails.

Rosas A(1), Nogueira E Jr, Fontanari JF.

Author information:
(1)Instituto de Física de São Carlos, Universidade de São Paulo, Caixa Postal 369, 13560-970 São Carlos, SP, Brazil.

The characterization of the long-range order and FRACTAL properties of DNA sequences has proved a difficult though rewarding task mainly due to the mosaic character of DNA consisting of many interwoven patches of various lengths with different nucleotide constitutions. We apply here a recently proposed generalization of the detrended fluctuation analysis method to show that the DNA walk construction, in which the DNA sequence is viewed as a time series, exhibits a monoFRACTAL structure regardless of the existence of local trends in the series. In addition, we point out that the monoFRACTAL structure of the DNA walks carries over to an apparently alternative graphical construction given by the projection of the DNA walk into the d spatial coordinates, termed DNA trails. In particular, we calculate the FRACTAL dimension D(t) of the DNA trails using a well-known result of FRACTAL theory linking D(t) to the Hurst exponent H of the corresponding DNA walk. Comparison with estimates obtained by the standard box-counting method allows the evaluation of both finite-length and local trends

PMID: 12513317 [PubMed - indexed for MEDLINE]

231. J Theor Biol. 2003 Jan 7;220(1):75-82.

The multiFRACTAL structure of arterial trees.

Grasman J(1), Brascamp JW, Van Leeuwen JL, Van Putten B.

Author information:
(1)Wageningen University and Research Centre, Biometrics, Postbus 100, 6700 AC Wageningen, The Netherlands.

FRACTAL properties of arterial trees are analysed using the cascade model of turbulence theory. It is shown that the branching process leads to a non-uniform structure at the micro-level meaning that blood supply to the tissue varies in space. From the model it is concluded that, depending on the branching parameter, vessels of a specific size contribute dominantly to the blood supply of tissue. The corresponding tissue elements form a dense set in the tissue. Furthermore, if blood flow in vessels can get obstructed with some probability, the above set of tissue elements may not be dense anymore. Then there is the risk that, spread out over the tissue, nutrient and gas exchange fall short.

Copyright 2003 Elsevier Science Ltd.

PMID: 12453452 [PubMed - indexed for MEDLINE]
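The cascade model referenced above can be illustrated with its simplest case, a binomial multiplicative cascade: at every branching a flow fraction p goes one way and 1-p the other, and after many levels the resulting measure over tissue elements is strongly non-uniform (multifractal). The branching parameter p = 0.7 below is an arbitrary illustrative value, not one taken from the paper.

```python
def binomial_cascade(p=0.7, levels=10):
    # Start with unit flow; at each of `levels` branchings every element's
    # share splits into fractions p and 1-p. After `levels` splits the
    # 2**levels terminal elements carry a highly non-uniform measure.
    measure = [1.0]
    for _ in range(levels):
        measure = [w * f for w in measure for f in (p, 1.0 - p)]
    return measure

m = binomial_cascade(p=0.7, levels=10)
# Total flow is conserved, but the best- and worst-supplied tissue
# elements differ by orders of magnitude: max(m) = p**levels,
# min(m) = (1-p)**levels.
```

This non-uniformity is the model's point: with p away from 0.5, a thin subset of elements dominates the supply, and random obstruction of vessels can leave under-supplied regions.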

[Dr. Pellionisz is legally permitted to practice Compensated Professional Services (Analysis, Advisorship, Consultantship, Board Membership, etc) as long as there is no "Conflict of Interest", through holgentech_at_gmail_dot_com.

Communication regarding Intellectual Property of any kind, including but not limited to patents, trade secrets, know-how associated with Dr. Pellionisz must be strictly gated by "Attorney Kevin Roe, Esq. FractoGene Legal Department", mailing address (USPS/UPS/FedEx) "155 E Campbell Ave, Campbell, CA 95008"]

23andMe aims to be Google for genetic research

By Heather Somerville

POSTED: 09/06/2014 05:00:28 PM

MOUNTAIN VIEW -- In less than a decade, biotech company 23andMe has turned a refrigerator full of spit into one of the largest databases of personal genetics information in the world.

The brainchild of Anne Wojcicki, the wife of Google co-founder Sergey Brin, 23andMe began in 2006 as a startup mailing DNA testing kits to customers' front doors and asking them to mail back a vial of saliva. Eight years later, the company is the gatekeeper of a database of hundreds of thousands of people's DNA -- a self-described Google for genetics information.

"It's actually bigger than anything else I can think of, way bigger," said Lisa Brooks, program director of the National Human Genome Research Institute, part of the National Institutes of Health.

23andMe has begun selling that genetics data to researchers and pharmaceutical companies to conduct large-scale medical studies, making it an emerging leader in a largely underexplored, and at times hotly debated, area of scientific research. In the last couple of months, 23andMe has announced a joint effort with Pfizer to research inflammatory bowel disease, released findings from a joint study of more than 100,000 people that made new discoveries on Parkinson's disease, and received a $1.4 million grant from the NIH.

But as the guardian of a very lucrative set of data -- the accuracy of which has come under question -- critics say the Mountain View company also may pose a threat to consumers' privacy.

Most medical studies take months or years to solicit enough volunteers. But 23andMe puts the genetic information of 700,000 people at researchers' fingertips, allowing medical studies to be fast-tracked and new treatments to make their way into hospitals sooner, experts say, giving patients with chronic diseases a better quality of life.

"Instead of actually having to do clinical trials the old-fashioned way, we can enable researchers to get their answers instantaneously," Wojcicki said in an interview with this newspaper. "And they pay us for that."

But some experts worry 23andMe users have no idea where their own genetic information will end up. Because the company is relying on data sales to become profitable -- selling $99 home genetic testing kits doesn't pull in the big-dollar revenue -- 23andMe may disseminate consumers' genetic information not only to government agencies and research institutions, say legal and bioethics experts, but also to big pharmaceutical companies, marketers and advertisers.

"There are a lot of people who would want to use that data. There's a lot of money potentially locked up in that data," said Charles Seife, a professor at New York University and longtime science writer.

Indeed, in a 2013 interview with The New York Times, Wojcicki said, "I remember in the early days of Google, Larry (Page) would say, 'I just want the world's data on my laptop.' I feel the same way about health care. I want the world's data accessible."

Some research experts also question the accuracy of 23andMe's data, which it collects through surveys it sends to customers, asking about their family and medical histories.

The validity of these studies "is going to depend on the accuracy of the medical information in 23andMe's database," said Dr. Paul Appelbaum, director of the Division of Law, Ethics and Psychiatry at Columbia University. "The information 23andMe has is exclusively self-reported, so how accurate it is will become a critical point."

Regardless, the NIH and the companies buying the data don't see a problem: "It doesn't need to be perfect to be helpful," said Brooks of the NIH. "23andMe has managed at no cost to the government to amass a very large set of data on people."

23andMe, which has raised more than $126 million and has the backing of Google's venture arm and investments from Brin, as well as Russian billionaire and Facebook and Twitter investor Yuri Milner, has proved a much more powerful research engine than almost any public effort to understand human genetics, which tend to suffer from underfunding. By comparison, a national genetics study in the United Kingdom is collecting DNA from 500,000 people, and the NIH genome project has 1,000 genome samples.

The NIH grant will allow 23andMe to further expand and refine its database so that it will eventually become a portal of genetics information researchers can access with a keystroke.

The boost from the NIH followed a protracted battle with the Food and Drug Administration, which in November compelled 23andMe to stop offering some of its genetic testing services after regulators questioned the accuracy of the results and warned of the danger of consumers receiving life-changing health information that doesn't come from a doctor. By that time, 23andMe had the DNA of 550,000 customers.

Customers must sign a consent form for 23andMe to use their genetics information in medical research -- and about 85 to 90 percent of customers do, Wojcicki said, explaining that most customers suffer from or have an interest in a genetic disease and want to contribute to scientific research.

But Appelbaum questions how many customers fully understand what they are agreeing to: "That consent rate is quite high. Most studies conducted in academic settings have much lower consent rates," he said. "That raises the question of how well 23andMe describes what it is that they are asking permission for."

The company sells genetics data for research only in aggregate "to minimize the possibility of exposing individual-level information," according to the privacy policy. But customers can also agree to share their personal information, and when given permission, 23andMe will give out details such as their name, email, height, eye color and birth date. The company could not say how many of its customers agree to that level of disclosure.

The privacy policy doesn't limit with whom 23andMe can share data and stipulates the company can "enter into commercial arrangements" with other companies for the purpose of selling or offering products and services.

Spokeswoman Catherine Afarian said 23andMe asks customers "in every single specific instance in which we want to use personal information" and "there is no scenario in which someone could give us a blanket permission. We believe you own your information. It's your data, you should have access to it and you should have control over it."

But in a 2012 study from the Massachusetts Institute of Technology, researchers proved they could determine the identities of some customers of two online genetics testing companies simply by studying their DNA samples, tracing their ancestry to figure out their last names and doing an Internet search. That led researchers to call for stricter data-sharing policies for companies such as 23andMe.

Wojcicki isn't surprised or fazed by the criticism: "23andMe is forging new ground," she said. "I'm not here just to try and make a quick buck. We're code-breaking here, and it's a really complex problem."

Contact Heather Somerville at 510-208-6413.

[It is now in the books of history that "Anne Wojcicki picked up the Genome Baby" in 2006 (when even the first ENCODE wasn't finished, thus "Junk DNA" and the Central Dogma still "ruled"; it took real guts for IT to pick it up):

YouTube of 2008

Today, in Mountain View, 23andMe aims to be the Google of Genetic Research, while Google Genomics seems committed to take off - but Venter just snatched Franz Och from Google, claiming that what they will do in genome analytics to fight cancer will make Google look like "child's play". Next door, also in Mountain View, Complete Genomics is now a subsidiary of China's BGI (see the same YouTube citing that Complete Genomics planned to also build their own "Google-type data center")... This peak season of US business, from Labor Day till Thanksgiving, will be very interesting. More off-line... AJP; holgentech_at_gmail_dot_com]

Mayo Clinic, IBM Collaborate to Match Patients with Clinical Trials

GenomeWeb Daily News

September 10, 2014

NEW YORK (GenomeWeb) – The Mayo Clinic and IBM this week announced a pilot program to use IBM's Watson to match patients with clinical trials.

The initiative is in its proof-of-concept stage and Watson is being familiarized with clinical trial terms. Further down the road, genetic and genomic information may be included in the patient-matching process, a Mayo Clinic spokesperson told GenomeWeb Daily News.

Mayo said that enrolling patients in clinical trials has been a challenge, and at the clinic just 5 percent of patients take part in clinical studies. Nationally, the enrollment is just 3 percent, Mayo added.

Additionally, the process is done manually with clinical coordinators sorting through patient records and conditions to match appropriate patients with studies. Mayo conducts more than 8,000 human studies, and Watson could accelerate and simplify the matching process by sifting through available Mayo clinical trials to ensure that more patients are accurately and consistently matched with clinical trial options.

The version of Watson to be used in the collaboration will be designed specifically for Mayo and as it moves through the collaboration, Watson will learn more about the clinical trial process, becoming more efficient and "likely more generalizable," Mayo said. Watson may also be able to identify patients for trials that are especially difficult to recruit patients for, such as those involving rare diseases.

"With shorter times from initiation to completion of trials, our research teams will have the capacity for deeper, more complete investigations," Nicholas LaRusso, a Mayo gastroenterologist and the project lead on the Mayo-IBM collaboration, said in a statement. "Coupled with increased accuracy, we will be able to develop, refine, and improve new and better techniques in medicine at a higher level."

Mayo and IBM are working to expand Watson's knowledge base to include all clinical trials at Mayo, as well as those in public databases. Additionally, the partners are exploring other applications for Watson in the future.


Venter steals top scholar from Google
U-T San Diego News

By Gary Robbins | 2:55 p.m. July 29, 2014

La Jolla geneticist J. Craig Venter has hired one of the world’s top computer scientists to help him try to prolong and improve people’s lives by deciphering hundreds of thousands of human genomes.

Franz Och was lured away from Mountain View-based Google, where he has been guiding Google Translate, a service that is capable of translating more than 80 languages. The software has more than 200 million active users and translates everything from Yiddish to Swahili to English.

Och, 42, will help find ways to make it faster and easier for scientists to sift through the extraordinary amounts of data produced by sequencing. He will serve as chief data scientist at Human Longevity, Inc., a La Jolla company that Venter founded earlier this year to conduct the largest sequencing effort ever undertaken. The company will initially sequence 40,000 human genomes a year, then ramp up to 100,000.

“I basically did a search and tracked down the person who led the Google Translate effort, which I see as similar to the challenges we face with genomics,” said Venter, who helped lead, and speed up, the Human Genome Project.

“The six billion letters of the genome represents one of the biggest translation issues ever. Your genetic code translates into your biological code which translates into you. We need to use machine learning to find associations between genes that mere mortals can’t find from staring at the data. It’s too complex.”

Venter also needs help deciphering the tremendous amount of data that will be generated by examining microbiomes, or the countless microbes found on the human body. Scientists believe that such microbes can affect people's health in ways large and small.

Och will operate out of Mountain View rather than moving to La Jolla.

“San Diego is a phenomenal place for recruiting biologically oriented scientists,” Venter said Tuesday. “But Silicon Valley is the place for people into computation science. So rather than try to convince a few hundred people to move to La Jolla, we’re just going to build on the talent base in Silicon Valley.”

Computational scientists also work at Human Longevity in La Jolla, and at a facility Venter is opening in Singapore.

In a statement, Och said, “We’re going to need the best and brightest from the areas of computer science, machine learning and big data generation and interpretation as well as those from biology, genomics and bioinformatics to reach a new level of understanding of this massive database.

“I look forward to working with Craig and the team at HLI to enhance our understanding of human biology, to better manage the healthy aging process and thus increase the healthy human lifespan.”


End of Summer - Beginning of The New Era of Global Industrial Bidding War

September 6th marks 2 years since the "shell shock" that swept "Junk Genomics" away forever. (ENCODE I had already established in 2007 that "the genome is much more complex than we have ever imagined"; it took another 5 years for the message to sink in.) With the "Labor Day Type Fireworks" of September 6th, 2012 - the admission of ENCODE II that "at least 80% of the human DNA is functional" - along with the earlier statement by Craig Venter that "our concepts of genome regulation are frighteningly unsophisticated", an entirely new era of global industrial competition is afoot as of this fall, led by Samsung, Sony/Illumina, Panasonic and BGI in Asia, Siemens and SAP in Europe, and the traditional health-care IT giants of the USA: HP, Dell, Intel, Oracle etc. More recent additions to the fervor are Calico (bringing together Apple with Google), and Google Genomics, now in head-on competition with the new genome analytics player, Craig Venter (both in Mountain View, along with BGI-owned Complete Genomics). While ENCODE I-II took 9 years, statistics show that in 2005 7.6 million people died of the genome regulation disease (cancer), a projected 9 million will die of cancer in 2015, and by 2030 the number is projected to reach 11.4 million. Thus, about 70 million humans died during ENCODE I-II, the period it took to clinch the need for software-enabling algorithms on readily available computers for cancer genome analysis. Who will take responsibility for any undue delays? One can monitor a death about every 3 seconds.
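The closing death-rate figure follows directly from the projections quoted above; a quick arithmetic check shows that 9 million deaths spread over one year works out to roughly one every 3.5 seconds, consistent with the "about every 3 seconds" claim.

```python
# Sanity check of the quoted rate, using the projections cited in the text.
seconds_per_year = 365.25 * 24 * 3600   # about 31.6 million seconds

deaths_2015 = 9_000_000                 # projected cancer deaths, 2015
interval = seconds_per_year / deaths_2015        # roughly one death per 3.5 s

deaths_2030 = 11_400_000                # projected cancer deaths, 2030
interval_2030 = seconds_per_year / deaths_2030   # roughly one death per 2.8 s
```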


Genome-based Personalization Industry

July 1st, 2014, amidst the deafening silence that ensued after the crashing collapse of the twin towers of ENCODE I and II, marks a crucial transition of genomics from a branch of Academia to a Genome-based Personalization Industry. A tell-tale sign was a Report (April, 2014) of the Personalized Medicine Coalition.



where the agenda was made rather obvious:

Common wisdom had it, till today, that Government-supported Academia, working together with Big Pharma, would deliver. The thesis here is that these assumptions are no longer true. It will be the "Genome-based Personalization Industry", once the ecosystem matures, that will "create or buy" the IT-based pharma to compete for the new market of personalized medicine. The turning point today is Germany. While for too long Germany was a "DNA ultraconservative superpower", the Germans are rich enough to afford the super-expensive new anti-cancer drugs, yet smart enough not to let their government-supported socialized health care waste money on the 75% of drugs that are, in any particular case, ineffective. This is an absolute, global "game changer" [AJP]

'Game-Changing' Cancer Benefit Launching

SAP will soon be offering employees a first-of-its-kind personalized tumor-analysis and treatment-option benefit. Experts weigh in on its prognosis.

By Kristen B. Frasch

Tuesday, July 1, 2014

Walldorf, Germany-based SAP will soon be launching a first-ever, company-sponsored cancer-analysis tool that -- if SAP has anything to say about it – could soon become commonplace as an employee benefit.

Called TreatmentMAP and developed by The Woodlands, Texas (U.S.)-based MolecularHealth, it will essentially offer a personalized treatment option to SAP employees fighting cancer -- starting as a pilot in two countries and eventually spreading companywide.

Using next-generation-sequencing genetic testing, TreatmentMAP will create tumor analyses and clinical interpretations of the genomic patient data to help employees' doctors and oncologists establish individualized treatment options in much less time than trial-and-error approaches such as chemotherapy.

In light of today's personalized-medicine trend, says Dr. Natalie Lotzmann, SAP's chief medical officer, "it's just a matter of time" before other companies adopt similar programs to help their employees with cancer fight the disease with the greatest possible outcome. (Indeed, SAP's own press release about the program says the company "hopes to convince other employers" to consider offering such a benefit.)

"Receiving a cancer diagnosis is a personal tragedy" that all companies will have to bear at some point, Lotzmann says. "The need for this tool will only increase [and] I believe will one day be the standard for all medical benefits at all companies."

Mind you, this is not a cure. But it could be a game-changer. Huge amounts of data generated from each tumor analysis would be processed with an ultra-fast genome-alignment algorithm -- part of the SAP Genomic Analyzer -- and would be used to assemble the full DNA sequence in three minutes, about 300 times faster than the alignment software previously used. From that data, the tool then immediately matches that specific genetic makeup with all the research available in the world indicating what was tried to treat this particular type of cancer.

"It's a decision-making tool," says Lotzmann, adding that the information attached to it is continually changing and being augmented. "This ensures the cancer patient gets the latest and greatest of treatments, the treatment that has been proven to work best," rather than one physicians or cancer centers must arrive at through trial and error.

So game-changing is this, says Dr. Friedrich von Bohlen, chairman of MolecularHealth, that "with the advent of information-based molecular-genetic diagnosis, cancer can be individually and precisely profiled, and potentially transformed from a terminal to a chronic disease."

And this, he adds, is a win-win, "supporting both the employee and his or her organization during the challenge of working and living with cancer."

In fact, Laura Housman, senior vice president and chief commercial officer in MolecularHealth's Boston office (the company's global headquarters are still in Heidelberg, Germany) sees this cancer-analysis benefit as a "competitive differentiator ... that can have a real impact on the employer and can impact shareholders" as well.

"That your company has the generosity and forethought, and is at the forefront to anticipate these needs going forward, that speaks volumes to current and prospective employees," she says, "and can enhance your reputation in general."

Traditional testing methods are only taking patients so far, says Housman, who agrees more insurers and employers will no doubt be providing something like the new tool SAP is using in the near future in their benefits programs.

"From an employer perspective," she says, "cancer is an increasing burden for both employers and patients." Indeed, the Alexandria, Va.-based American Society of Clinical Oncology anticipates a 45-percent increase in new cancer cases by 2030, and a 35-percent increase between now and 2020 in the number of cancer survivors due to better treatment.

"You have to figure, a significant number of these people will be employees somewhere," Housman says.

So what should HR leaders be keeping in mind about this latest chapter in the ongoing and never-ending employee-benefits evolution? Both Lotzmann and Housman note that the privacy of employees' genetic profiles needn't be a concern, as TreatmentMap facilitates knowledge sharing with clinicians and oncologists but doesn't store the genetic information.

"MolecularHealth is a [Health Insurance Portability and Accountability Act]-compliant provider," Housman says, "and this information is only shared between provider and patient -- and as far as payment goes, [the parties] are de-identified."

Though total cost was not discussed, both say early adoptions such as this are always more expensive, with costs expected to come down over time as familiarity and use increase. They also stress the importance of gaining C-level buy-in before jumping on any bandwagon, and of making sure HR and benefits leaders are involved.

Of course, as with anything this new, "the clinical efficacy of such tumor analyses needs ongoing study," says Thomas Parry, president of the San Francisco-based Integrated Benefits Institute.

Matching drugs to patients may still depend on clinical trials, he says, just on a "broader range of people and perhaps of outcomes that are important both to employees and to their employers -- as in absence from work, disability and performance."

The notion that tumor analysis may eliminate trials and experiments, he adds, "ignores the fact that prior research is needed to understand how to get the right match of drugs to patients."


De-Bullshitting Big Data (Stanford-Oxford) - says Greg Kovacs

On May 21-23 at Stanford University (co-anchored by Oxford), the rather exclusive, top-level "Big Data in Biomedicine" world conference took place.

Some highlights included the "jaw-dropping" presentation by Stanford professor Greg Kovacs, who summed up the Big Agenda in a single pithy word ("Algorithms." - albeit framed in language that one may find offensive, though a sense of frustration is palpable and can perhaps even be read with empathy):

It is unclear at this minute if the Agenda will pan out as pictured below:

A reason for interest was another "game changer" presentation, on Google Genomics (by David Glazer). A publicly available posting (below, viewed by 95,135) by a member of the Google Genomics team (David Konerding, present but not presenting) raised a question (voiced by a party momentarily third to both Google and BGI, and found "deep"): whether reverse-engineering Nature-made systems is similar to reverse-engineering totally man-made (and algorithmically transparent) systems, such as the Internet.

[The conspicuous presentation by Prof. Kovacs of Stanford University will be posted by mid-June on the BigData website. Viewers will thus be able to judge for themselves how a perhaps frustrated tone in 2014, after the almost mindless rush for "The Dreaded DNA Data Deluge" had consumed billions of dollars' worth of data-gathering (and cost data-gathering companies much of their valuation), compares to a whistle-blowing in 2008: the public question broadcast as the Google Tech Talk YouTube "Is IT Ready for the Dreaded DNA Data Deluge?" As for the entire category of "Algorithms", The Principle of Recursive Genome Function (and the utility of its subset, Fractal Iterative Recursion) was put forward in 2008, and the IP is now in force until late March, 2026 - AJP]

Your Genes Are Obsolete

Pacific Standard

The Science of Society

BY MICHAEL WHITE • May 02, 2014 • 8:00 AM

Genes don’t consistently do what we once thought they would, so it’s time to reconsider what we mean when we say the word.

[Some established and internationally award-winning informatics specialists never believed either of the disarmingly naive "Central Dogma" or "Junk DNA" dogmas - and mathematically redefined the genome in the non-Euclidean, geometrical, software-enabling terms of nonlinear dynamics, drawing the consequences of the utility of "Fractal genome grows fractal organisms". Now that "Junk DNA is anything but" and "Genes are obsolete", it may be worthwhile to turn to FractoGene; see the timeline of its first decade since 2002 here.]

Today, DNA is central to modern biology, but scarcely a century ago biologists were debating whether or not genes actually existed. In his 1909 textbook on heredity, Danish botanist Wilhelm Johannsen coined the term gene to refer to that hereditary “something” that influences the traits of an organism, but without making a commitment to any hypothesis about what that “something” was. Just over a decade later, a prominent biologist could still note that some people viewed genes as “a convenient fiction or algebraic symbolism.”

As the century progressed, biologists came to see genes as real physical objects. They discovered that genes have a definite size, that they are linearly arrayed on chromosomes, that individual genes are responsible for specific chemical events in the cell, and that they are made of DNA and written in the language of the Genetic Code. By the time the Human Genome Project was initiated in 1988, researchers knew that a gene was a segment of DNA with a clear beginning and end and that it acted by directing the production of a particular enzyme or other molecule that did a specific job in the cell. As real things, genes are countable, and in 1999 biologists estimated that humans had “80,000 or so” of them.

Yet, when the dust from the Human Genome Project cleared, we didn’t have nearly as many genes as we thought. By the latest count, we have 20,805 conventional genes that encode enzymes and other proteins. Our inflated gene count, though, wasn’t the only casualty of the Human Genome Project. The very idea of a gene as a well-defined segment of DNA with a clear functional role has also taken a hit, and as a result, our understanding of our relationship with our genes is changing.

One major challenge to the concept of a gene is the growing evidence that many genes are shapeshifters. Instead of a well-defined segment of DNA that encodes a single protein with a clear function, we should view a gene as “a polyfunctional entity that assumes different forms under different cellular states,” according to University of Washington biologist John Stamatoyannopoulos. While researchers have long known that genes are made up of discrete subunits called “exons,” they hadn’t realized until recently the degree to which exons are assembled—like Legos—into sometimes thousands of different combinations. With new technologies, biologists are cataloging these various combinations, but in most cases they don’t know whether those combinations all serve the same function, different functions, or no function at all.

Our concept of a gene is also challenged by the fact that much of the function in our DNA is located outside of conventionally defined genes. These “non-coding” functional DNA segments regulate when and where conventional protein-coding genes operate. For our biology, non-coding regulatory DNA elements are as consequential as genes, but their properties are even more difficult to define because their function isn’t based on the well-understood Genetic Code and their boundaries are even fuzzier than gene boundaries. As a result, non-coding regulatory DNA elements are much more difficult to count. One consortium of researchers put the number of regulatory DNA segments in the human genome between 580,000 and 2.9 million, while just last month a different consortium claimed that there are only 43,000. Regardless of how you count them, it’s clear that these non-gene regulatory DNA elements far outnumber conventional genes. It is hard not to wonder, then, what good is the concept of a gene if it doesn’t include most of our functional DNA?

In the aftermath of the Human Genome Project, biologists are struggling with the definition of a gene, but why should this matter to anyone else? It matters because the molecular concept of the gene that has dominated biomedical research for the last half-century is increasingly ill-suited for our efforts to understand the role of genetics in human biology. Giving a physical meaning to the concept of a gene was a triumph of 20th-century biology, but as it turns out, this scientific success hasn’t solved the problems we hoped it would.

The Human Genome Project was conceived as part of a research program to develop a set of clear molecular explanations for our biology. The idea was to inventory all of our genes and assign each of them a function; with this annotated inventory in hand, we would possess a molecular explanation of our genetic underpinnings and discover druggable target genes for specific diseases. While this gene-focused approach has been successful in many cases, it’s increasingly clear that we will never understand the role of genetics in our biology by merely making an annotated inventory of those DNA entities that we call genes.

Life isn’t so simple, and perhaps Wilhelm Johannsen’s more agnostic definition of a gene is a better match to the mixed bag of genetic elements in our genomes. The molecular concept of a gene was supposed to explain the influence of our DNA on our biology, our behaviors, and our ailments. That explanation is much more elusive than we hoped, and the role of DNA in our lives is more complex and subtle than we expected.

[Indeed, see the non-trivial but software-enabling algorithmic elaboration here]

Small non-coding RNAs could be warning signs of cancer

HEIDELBERG, 17 February 2014 – Small non-coding RNAs can be used to predict if individuals have breast cancer, conclude researchers who contribute to The Cancer Genome Atlas project. The results, which are published in EMBO reports, indicate that differences in the levels of specific types of non-coding RNAs can be used to distinguish between cancerous and non-cancerous tissues. These RNAs can also be used to classify cancer patients into subgroups of individuals that have different survival outcomes.

Small non-coding RNAs are RNA molecules that do not give rise to proteins but which may have other important functions in the cell. “For many years, small non-coding RNAs near transcriptional start sites have been regarded as ‘transcriptional noise’ due to their apparent chaotic distribution and an inability to correlate these molecules with known functions or disease,” explains Steven Jones, one of the lead authors of the study, a professor at Simon Fraser University and the University of British Columbia, and a distinguished scientist at the BC Cancer Agency. “By using a computational approach to analyze small RNA sequence information that we generated as part of The Cancer Genome Atlas project, we have been able to filter through this noise to find clinically useful information,” adds Jones. “The data from our experiments show that genome-wide changes in the expression levels of small non-coding RNAs in the first exons of protein-coding genes are associated with breast cancer.”

The scientists were able to distinguish between the many different small non-coding RNAs that are found near the transcriptional start sites of genes in healthy individuals and patients with breast cancer (in this case, breast invasive carcinoma). They mapped these RNA molecules to specific locations on the DNA sequence and looked for correlations between the non-coding RNAs that were strongly expressed and the disease status of the patients from whom the tissue samples were isolated. The researchers then tested if the expression of the small RNAs in genomic locations that they were able to identify could be used to predict the presence of disease in another group of tissue samples obtained from patients known to have breast cancer. The test efficiently predicted the correct disease status for the samples in the new study group.
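The train-then-validate design described above can be illustrated with a deliberately toy sketch. All sample labels, loci and expression values below are invented, and a nearest-centroid rule (one of the simplest possible classifiers) stands in for whatever statistical method the authors actually used:

```python
# Hypothetical sketch: classify tissue samples as "tumor" or "normal" from
# expression levels of small non-coding RNAs mapped near transcription start
# sites. Toy numbers; the study's real pipeline is genome-wide.
import math

# Each sample: expression of small RNAs at a few TSS-proximal loci.
train = {
    "tumor":  [[9.1, 0.2, 7.8], [8.7, 0.4, 8.2], [9.5, 0.1, 7.5]],
    "normal": [[1.2, 6.9, 0.8], [0.9, 7.4, 1.1], [1.5, 6.5, 0.7]],
}

def centroid(vectors):
    """Mean expression profile of a group of samples."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def predict(sample, centroids):
    """Assign the label whose centroid is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

centroids = {label: centroid(vs) for label, vs in train.items()}

# "Validation" on held-out samples whose true status is known.
held_out = [([8.9, 0.3, 7.9], "tumor"), ([1.1, 7.0, 0.9], "normal")]
for profile, truth in held_out:
    print(predict(profile, centroids), "vs. true:", truth)
```

A real analysis would involve genome-wide read counts, normalization and cross-validation; the point here is only the shape of the procedure: learn group profiles from labeled tissue samples, then predict the disease status of a new group of samples.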

“The potential to predict cancer status is restricted to only a subset of the many small non-coding RNAs found near transcription start sites of the genes. What’s more, these RNA locations are highly enriched with CpG islands,” says Athanasios Zovoilis, the first author of the study. CpG islands are genomic regions that contain a high frequency of cytosine and guanine. The presence of these RNAs in these islands may implicate their involvement with DNA methylation processes and the onset of disease but additional experiments are needed to explore and prove this link.
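For readers unfamiliar with CpG islands, the classic Gardiner-Garden and Frommer heuristic (GC content above 50% and an observed-to-expected CpG ratio above 0.6 over a window of at least 200 bp) can be sketched in a few lines. The thresholds are the textbook ones and the toy sequences are illustrative; neither is taken from the study:

```python
# Minimal sketch of the Gardiner-Garden & Frommer CpG-island criteria.
def cpg_stats(seq):
    """Return (GC fraction, observed/expected CpG ratio) for a DNA window."""
    seq = seq.upper()
    n = len(seq)
    c, g = seq.count("C"), seq.count("G")
    cpg = seq.count("CG")  # observed CpG dinucleotides (non-overlapping)
    gc_frac = (c + g) / n
    # Expected CpG count if C and G were independently distributed: c*g/n,
    # so observed/expected = cpg * n / (c * g).
    obs_exp = (cpg * n) / (c * g) if c and g else 0.0
    return gc_frac, obs_exp

def looks_like_cpg_island(seq):
    gc_frac, obs_exp = cpg_stats(seq)
    return gc_frac > 0.5 and obs_exp > 0.6

print(looks_like_cpg_island("CGCGCGGCGCGCGGCGCGCG"))  # CpG-rich: True
print(looks_like_cpg_island("ATATTATATTAATATATTAT"))  # AT-rich: False
```

(In practice the test is slid over windows of at least 200 bp across the genome; the toy sequences above are kept short for readability.)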

“This is the first time that small non-coding RNAs near the transcription start site of genes have been associated with disease,” says Jones. “Further work is required but based on our data we believe there is considerable diagnostic potential for these small non-coding RNAs as a predictive tool for cancer. In addition, they may help us understand better the mechanisms underlying oncogenesis at the epigenetic level and lead to potential new drugs employing small non-coding RNAs.” The researchers also note that this class of small non-coding RNAs may be useful in predicting the existence of other types of cancer or disease.

The generation of data by The Cancer Genome Atlas project, which now provides access to large amounts of sequencing information for diseased and normal tissues, made the work possible. The Cancer Genome Atlas is now one of the largest resources for small non-coding RNAs in existence.

The Brave New World of Medicine

Vivek Wadhwa

For The Washington Post

Monday, April 14, 2014

Health care is a misnomer for our medical system. It should be called sick care. Doctors, hospitals and pharmaceutical companies only make money when we are in bad health. If we could instead prevent illness and disease, it would turn the entire medical system on its head and increase the quality of our lives.

The good news is that technology is on its way to letting us do this. It is now moving so rapidly that within a decade the small handheld medical reader used by Dr. Leonard McCoy in Star Trek — the tricorder — will look primitive. We are moving into an era of data-driven, crowd-sourced, participatory, genomics-based medicine. Just as our bathroom scales give us instant readings of our weight, wearable devices will monitor our health and warn us when we are about to get sick. Our doctors — or their artificial intelligence replacements — will prescribe medicines or lifestyle changes based on our full medical history, holistic self and genetic composition.

It wasn’t long ago when our only recourse when we doubted our doctor’s prescription was to seek a second opinion. Now when we need information about an ailment we search on the Internet. We have access to more medical knowledge than our doctors used to have via their medical books and journals, and our information is more up-to-date than those medical books were. We can read about the latest medical advances anywhere in the world. We can visit online forums to learn from others with the same symptoms, provide each other with support and discuss the side effects of our medicines. We can download mobile applications that help us manage our health. All of this can be done by anyone with a smartphone.

Our smartphones also contain a wide array of sensors, including an accelerometer that keeps track of our movement, a high-definition camera that can photograph external ailments and transmit them for analysis, and a global positioning system that knows where we have been. Wearable devices such as Fitbit, Nike and Jawbone are commonly being used to monitor the intensity of our activity; a heart monitor such as one from Alivecor can display our electrocardiogram; several products on the market can monitor our blood pressure, blood glucose, blood oxygen, respiration and even our sleep. Soon we will have sensors that analyze our bowel and bladder habits and food intake. All of these will feed data into our smartphones and cloud-based personal lockers. Our smartphone will become a medical device akin to the Star Trek tricorder.

When we get sick, we won’t need to go — in high temperature and in severe pain — to our doctors’ offices, only to wait in line with patients who have other diseases that we may catch. Our doctors will come to us, over the Internet. Telemedicine is already a fast-growing field; doctors have been assisting people in remote areas by using two-way video, email and smartphones. They will increasingly assist us in our homes. Our smartphone and body sensors will provide them with better medical data than they usually have today.

Then our smartphones will evolve further and do part of the job of doctors.

The same type of artificial intelligence technology that IBM Watson used to defeat champions on the TV game show Jeopardy will monitor our health data, predict disease and advise on how to improve our health. Already, IBM Watson has learned about all the advances in oncology and is better at diagnosing cancer than our human doctors. Watson and its competitors will soon learn about every other field of medicine, and will provide us with better, and better-informed, advice than our doctors do. They will take a more holistic view of our bodies, lifestyles and symptoms than our doctors can. They will, after all, have our full medical history from childhood, know where we have been, and keep track of our medical data on a minute-by-minute basis. Most doctors still work from brief, unintelligible, hand-scribbled notes and try to make a judgment about what medicines to prescribe us in a 10- to 15-minute consultation; they treat symptoms of interest but can overlook the bigger picture of where the treatment leads.

Artificial intelligence technologies will also analyze continual data from millions of patients and on the medications that they have taken to determine which of these truly had a positive effect; which simply created adverse reactions and new ailments; and which did both. This will transform the way in which drugs are tested and prescribed. In the hands of independent researchers, these data will upend the pharmaceutical industry — which works on limited clinical-trial data and sometimes chooses to ignore information that does not suit it.

This is just the tip of the iceberg.

We learned how to sequence the genome about a decade ago, and sequencing it cost billions. Today a full human genome sequence costs as little as $1,000. At the rate at which prices are dropping, it will cost less within five years than a blood test does today. So it is now becoming affordable to compare one person’s DNA with another’s, learn what diseases those with similar genetics have had in common, and discover how effective different medications or other interventions were in treating them. Today, medicines are prescribed on a one-size-fits-all basis. In the future, you can expect to see doctors tailor treatment for diseases on the basis of an individual’s genomic information and lifestyle.

We can also now “write” DNA. In the emerging field of synthetic biology, researchers and even high-school students, are creating new organisms and synthetic life forms. Entrepreneurs have developed software tools to “design” DNA. These technologies provide the ability to generate designer drugs, therapeutic vaccines and microorganisms. Like all technologies that modify fundamental biology without a complete understanding of how environment, DNA, protein production and cell biology interact, this introduces new risks because we could engineer dangerous new organisms. But, used appropriately, this field may dramatically affect the development of novel, and more effective, therapeutics.

Ultimately, disease prevention is about lifestyle and habits as well as about genome and exposure to disease. Technology combined with good habits can create the health care system that we really need. We’re not dependent on Big Pharma, the medical establishment, or even the Food and Drug Administration. Medicine has become an information technology. The advances in health care are being developed by entrepreneurs and scientists all over the world. There is no stopping this.

Vivek Wadhwa is a fellow at Rock Center for Corporate Governance at Stanford University, director of research at Duke University, and distinguished scholar at Singularity and Emory universities.

RUSSIA WAKES UP TO GENOMICS - IBinom introduces revolutionary genomic data interpreter that's fast, efficient, user-friendly, and affordable


MOSCOW, RUSSIA - Mar 26, 2014 - Moscow startup iBinom has released the beta version of a cloud-based SaaS solution for clinical interpretation of human genome and exome data. 

As more clinicians and researchers come to rely upon genetic data in their quest to combat thousands of devastating human diseases and conditions, accurate and speedy analysis of that constantly expanding body of data becomes increasingly critical. Geared to the physician or geneticist who isn't necessarily a computer geek - and that would probably describe most professionals in those fields - iBinom's beta version is currently available for free at

This should be welcome news to medical centers, research centers, and sequence providers everywhere, notes iBinom's co-founder, Valery Ilinsky.

Ilinsky explains, "iBinom has developed one of the most precise algorithms presently available to determine rare pathogenic mutations among millions of non-pathogenics. Our solution is the fastest available as well - only 30 minutes from start to results." At present, he adds, tools offered by his firm's closest competitor take at least three hours to accomplish only part of the task that iBinom's data interpreter completes in half an hour. In addition, iBinom provides users with a clear report that does not require a programmer or a bioinformatician to understand.

Co-founded by Ilinsky and the company's CEO Andrey Afanasyev, iBinom currently has offices in Moscow and St. Petersburg in Russia, as well as in Los Angeles. The firm has assembled a group of highly skilled programmers, several of whom have won global competitions in algorithms and programming, such as TopCoder, CodeForces, ACM ICPC, IPSC and Google Code Jam. Ilinsky says iBinom anticipates a brisk demand for its service as prospective customers discover the advantages it provides over tools currently in use.

Ilinsky explains that iBinom was designed to address a dual problem faced by genetics researchers and clinicians alike: an overabundance of data, and a need to make sense of that data. "The number of sequenced genomes is growing exponentially, and if it keeps increasing at the current rate, approximately 25 million genomes will have been sequenced by 2016," he says. "The incorporation of whole-genome and whole-exome sequencing into clinical practice will undoubtedly change the way genetic counselors and other clinicians approach genetic testing."


Many clinicians are limited by currently available tools because human genome sequencing for clinical purposes is associated with detailed and precise Big Data analysis, Ilinsky notes. Initial raw data includes approximately 200GB of genetic data per patient, which is analyzed using either desktop applications or SaaS. However, Ilinsky says, current methods require special knowledge to interpret this massive amount of data, and do not provide medical reports. This greatly delays total turnaround time and decreases the sequencing capacity of service providers, as genome analyzers currently in use can take many hours to generate raw data for only one human genome.

By contrast, iBinom offers a simpler, faster - and ultimately more affordable - means of genomic data interpretation. The user-friendly interface was designed for physicians and geneticists, but no special programming or bioinformatics expertise is required, and a medical report may be downloaded in one click. In addition to its fast proprietary algorithms of data analysis, the use of Amazon and Yandex IaaS (Infrastructure as a Service) has provided iBinom with the edge that allows production of a single human genome analysis in only half an hour.

Pricing is expected to be flexible and affordable compared to what is currently available. iBinom's SaaS business model allows customers to pay a fixed price for each analysis and a monthly fee for storage.

Says Ilinsky, "iBinom has devoted a great deal of time, effort, and expertise to developing advanced algorithms for accurate, speedy analysis of all 30,000 human genes that could be linked to an estimated 3,000 inherited conditions with known genetic origin. We anticipate that iBinom will take next-generation sequencing out of the laboratories and into the hospitals and clinics, which will finally be able to routinely use exome and whole-genome sequencing to diagnose their patients and manage their care."

[In itself, a Moscow-based start-up (with an office in Los Angeles...) may not appear to signal such a strategic shift as it does. The Moscow-based company, using "proprietary algorithms" apparently for exomics at this early time-point, is likely to be "stealth", under the radar. Similarly, not many thought ahead of the game when, in a dilapidated shoe factory, the Beijing Genome Institute started to receive $Bn-s of government subsidy since "we are not interested in the white people's diseases, but we are interested in the Chinese people's diseases - besides, Genomics is of strategic interest". Russia is rather obviously waking up these days to her own "strategic interests". Why would the Russian (Slavic) genome be strategically so important? Most importantly (similar to the genomes of about 9 Chinese tribes), the homogeneity, and thus vulnerability, of Slavic/Chinese genomes is orders of magnitude more exposed, compared to the extremely heterogeneous genome pool of e.g. India (one answer to why India spends much less on genomics than Pakistan does). The American genome pool is one of the most heterogeneous "salad bowls" on Earth - but it has access to practically all specific "genomes", and at the moment might still be in the lead to understand, and thus deploy, "genome regulation" with unexpected precision. Are the Chinese and the Russians (etc.) interested in advanced IT "proprietary solutions"? You may want to note that Mr. Snowden, after a stop-over in China and now in Russia, appears to be on his way to becoming an "Honorary Doctor" of a German university. Interesting. - AJP]

From 23-28 June 2014, the Institute of Cytology and Genetics, Novosibirsk, Russia, will conduct the regular International Conference on Bioinformatics of Genome Regulation and Structure\Systems Biology (BGRS\SB-2014).

[Akademgorodok, the "scientific research center tucked safely away" is 20 miles from Novosibirsk, "Capital of Siberia". With a supercomputer available also for genomics, they have recently obtained an additional $1Bn. - Dr. Pellionisz]


Google invites geneticists to upload DNA data to cloud

SFgate, March 18, 2014

Stephanie M. Lee

Updated 5:04 am, Tuesday, March 18, 2014

Googling a person is about to take on a completely new meaning.

The Mountain View search giant recently invited geneticists to upload information to the company's cloud infrastructure. Google also provided scientists with instructions on how to import, process, store and, of course, search DNA data that could unlock clues to curing diseases.

Google's foray into genomics could open big markets for a company that has already made substantial investments in health care.

"This whole area, by the way of genome analysis, is really in general a hot area," said George Geis, an adjunct professor who specializes in technology mergers and acquisitions at UCLA's Anderson School of Management.

A MarketsandMarkets report valued the global genomics market at $11 billion in 2013 and predicted it will reach $19 billion by 2018. The North American health care cloud-computing market is also expected to grow fast in that time, up 30 percent to $6.5 billion.

Enormous databases

Sequencing the human genome can reveal deadly mutations as well as pathways for life-saving drugs. Since one individual's genome can add up to about 100 gigabytes of data, researchers must perform rigorous analysis to pry insights out of enormous databases.
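A back-of-envelope calculation, with assumed round numbers (coverage depth, per-base overhead and compression ratio are illustrative, not from the article), shows where figures like "about 100 gigabytes" per genome come from:

```python
# Back-of-envelope sketch of raw sequencing data volume per individual.
# Raw FASTQ stores roughly one byte of base call plus one byte of quality
# score per sequenced base, and clinical whole-genome sequencing reads
# each position about 30 times over.
GENOME_BASES = 3.2e9   # approx. haploid human genome length
COVERAGE = 30          # assumed typical sequencing depth
BYTES_PER_BASE = 2     # ~1 byte base + ~1 byte quality in FASTQ
COMPRESSION = 0.5      # assumed ~2x compression (gzip-level)

raw_bytes = GENOME_BASES * COVERAGE * BYTES_PER_BASE
stored_gb = raw_bytes * COMPRESSION / 1e9
print(f"~{stored_gb:.0f} GB per genome")  # ~96 GB, the same order as quoted
```

Varying the assumed depth and compression moves the answer between the "about 100 gigabytes" here and the "approximately 200GB" quoted elsewhere in this compilation, which is why such published figures differ.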

At the same time, scientists and companies must answer questions about where to store the information, who should have access and how to protect privacy.

To help address those concerns, Google recently joined the Global Alliance for Genomics and Health, a coalition of health care providers, research universities, life science firms and others. The group, which met for the first time this month, is trying to encourage the industry to pool resources and establish standards on how to manage the data.

Expanding into health

For now, Google is providing the genetics community with Web services for free. But "it could very well be something that provides a top-line revenue for them going forward," Geis said.

The company is quickly expanding into health and medicine. Google owns Calico, a biotech company developing technologies to extend human life. In January, Google unveiled a prototype contact lens designed to help diabetics monitor their blood glucose. It also uses aggregated search data to estimate flu activity in more than 25 countries, and once operated a personal health-record service that let users create profiles for their health conditions, medications, allergies and lab results.

Someday, Google could use genetic data to make its own medical discoveries, Geis said.

"It could very well transform Google into another type of company - it partners with a pharmaceutical company or licenses its discoveries and patents its discoveries," he said.

But Google faces competition from companies large and small already in the genomics-analysis field.

For two years, Amazon's cloud service has hosted the 1000 Genomes Project, the world's largest database of human genetics. Although the data are public and free, Amazon does charge researchers to use its high-powered computing resources to run data calculations.

'Lots of room'

"Generally speaking, the cloud space is still in its infancy and there's lots of room, certainly, for industrial players to try to apply their particular implementation of cloud technologies to new areas," said Ramon Felciano, co-founder of Ingenuity Systems. The Redwood City company, which was recently acquired by Qiagen, makes software for analyzing biological data.

DNANexus in Mountain View also provides cloud storage for genetic data. Chief Cloud Officer Omar Serang said the company joined the Global Alliance for Genomics and Health because scientists and companies can't advance research without a common format.

"Information is incredibly siloed," Felciano said.

The sensitive nature of genetic data prevents easy collaboration due to patients' privacy concerns.

"The whole field of genetic research, especially genetic research based off large data sets or well-sequenced genomes, is ripe with privacy and ethical issues," said Lee Tien, a senior staff attorney at the Electronic Frontier Foundation in San Francisco.

Privacy standards

While hospitals and academic institutions must adhere to patient privacy rules, private companies do not necessarily follow those standards, Tien said.

Members of the Global Alliance for Genomics and Health are working to develop common policies that govern ethics, data storage and security. For its part, Google says genomic data stored in its cloud is secure.

"Private data remains private, public data is available to the community anywhere," the company said on its website.

Tien remains cautious.

"It may all go well," he said. "But in other cases there will be surprises - and you don't really want surprises with genetic data."

Stephanie M. Lee is a San Francisco Chronicle staff writer. E-mail:


March 24, 2014 | By Nick Paul Taylor

Google's ($GOOG) expansions into life sciences and genomics have raised two big, as yet unanswered, questions--how will they affect the industry, and what do they mean for the company? This week, the San Francisco Chronicle looked into possible answers to both questions.

In the past 6 months Google has set up a biotech focused on aging and a genomics cloud services platform, but both projects are still taking shape. It is still too early to tell whether they will succeed or join Google's big scrap heap of axed projects, but University of California, Los Angeles' George Geis sees reasons for optimism. Geis, who specializes in technology mergers and acquisitions, thinks Google Genomics has the potential to give the search giant new sources of revenues.

"It could very well transform Google into another type of company--it partners with a pharmaceutical company or licenses its discoveries and patents its discoveries," Geis told SF Chronicle. Such a scenario currently appears a distant prospect, but is one possible outcome of Google applying its mission to "organize the world's information and make it universally accessible and useful" to genomics.

Whether people are comfortable with Google organizing the world's genomic data is still questionable, though. Revelations of National Security Agency snooping on tech giants' data centers have made some people wary of trusting the likes of Google with their search histories, let alone their genome data. And the results of a recent survey and the outcry over the English patient record database have shown many people are uncomfortable with companies profiting from their health data.

Earlier this month MedCity News reported on an exchange between technology columnist Kara Swisher and 23andMe CEO Anne Wojcicki at SXSW that sums up the concerns. "I don't like the idea of Google having my gut bacteria on file because they could monetize it, you know they could," Swisher said.

New details on brains and budget behind Google's Calico emerge

October 10, 2013 | By Nick Paul Taylor

Google ($GOOG) generated a deluge of headlines when it unveiled its antiaging venture Calico but gave very few details about the nuts and bolts of the company. Now, reports on the questions Calico will look into--and how much cash it will have to answer them--have started to emerge.

Calico is the brainchild of Google Ventures' managing partner Bill Maris, Fortune reports. Maris--a former biotech portfolio manager at Investor AB--noted a lack of companies trying to stop the degradation of genetic material that ultimately causes cells to fail. Efforts to raise cash for a company to fill this gap led to discussions with Google cofounder Sergey Brin and eventually a decision by the search giant to fund the entire project.

Exactly how much Google is committing to Calico is unclear, but sources told Fortune it is at the very least hundreds of millions of dollars. Google has indicated it views this as a long-term investment--with the payoff coming 10 to 20 years down the line--and there are suggestions it is creating more of a research institute than a traditional biotech. Calico is unlikely to be rushing a candidate into Phase I.

Instead, it will reportedly look into questions like what elements are shared by the genomes of thousands of healthy 90-year-olds from all parts of the world and how we can use this information to extend the lives of the broader population. The vision of Calico that has emerged in early media reports is one of a company underpinned by the genetic breakthroughs of the past decade and likely to use computing power to extract fresh insights from the data.

[UCLA Professor George Geis is absolutely right. The "ecosystem" of genomics has changed forever. As pharma-guru Karoly Nikolich put it (at 1:04 of my Churchill Club "Genome Computer" panel on YouTube), "sooner or later a big IT company, like Google, will buy or build their own 'pharma'". Now it is up to Samsung, Sony/Olympus/Illumina, Siemens, Google, Apple, GE Health, Cisco, Calico, Longevity etc. "to leverage" Old Pharma. The magazine Newsweek, however, is also absolutely right: "you cannot cure what you cannot understand". Thus, as all IT companies know, without buying or building the algorithmic IP, the bloodbath ahead might resemble the Internet "search engine legacy" (the best algorithm rules not only the understanding of the net but the business itself; the core question may be: what is the fractal, recursive mathematical algorithm of genome function?). "Privacy" is not the name of the game in cancer; "Survival" (precision medicine) would be the more likely answer of, e.g., the late Steve Jobs. - Pellionisz]

Dr. Erez Aiden named newest McNair Scholar at Baylor College of Medicine

Glenna Picton

(713) 798-7973

Houston, TX - Mar 26, 2014

Award-winning scientist Dr. Erez Aiden has been named the newest McNair Scholar at Baylor College of Medicine.

The McNair Scholar program at Baylor identifies established and rising stars in biomedical research to be recruited to Baylor. The program is supported by the Robert and Janice McNair Foundation and managed by the McNair Medical Institute.

Aiden and his collaborators invented the Hi-C method for three-dimensional genome sequencing and Aiden subsequently led the team that reported the first three dimensional map of the human genome. His lab continues to develop powerful new technologies and methods for interrogating genomes in three dimensions.

With funding from the McNair Scholar Program and the Cancer Prevention and Research Institute of Texas, Aiden was recruited to Baylor as an assistant professor of genetics. He holds a joint appointment as an assistant professor of computer science and applied mathematics at Rice University.

Aiden received his undergraduate degree from Princeton University; completed a Master of Arts degree in history from Yeshiva University; and completed a Master of Arts degree in Applied Physics as well as a Ph.D. in Applied Math and Health Sciences and Technology from Harvard University and the Massachusetts Institute of Technology in Cambridge. Subsequently, Aiden was a fellow at the Harvard Society of Fellows and a visiting faculty member at Google, Inc.

His research has won numerous awards including the Lemelson-MIT prize for best student inventor at MIT; membership in Technology Review's 2009 TR35, recognizing the top 35 innovators under 35; and a National Institutes of Health New Innovator Award.

In 2012, he received the President's Early Career Award in Science and Engineering, the highest government honor for young scientists, awarded by the President of the United States. His work has been featured on the front page of the New York Times, the Boston Globe, and the Wall Street Journal, and his online talks have been viewed over a million times. Fast Company recently called Aiden “America's brightest young academic.”

[Some readers may be confused by the "discovery of genomes in three dimensions". Now that old axioms have fallen to the degree that "modern genomics is a well-dressed gentleman with no shoes", has anything overthrown the three-dimensional "double helix" (Franklin, Watson, Wilkins and Crick, 1953)? The reader may rest assured that the three-dimensional "double helix" stands as strong as ever! So what is the fuss about "three-dimensionality"? Actually, nothing new has been discovered lately about the three-dimensional folding of the DNA strand as such, since Grosberg's papers some two decades ago. The spectacular invention by the genius (Erez Aiden) is the so-called "Hi-C method" (see reference below) for mapping out the functional proximity of parts of the 2m-long human DNA noodle, arriving at the astonishing proof that linearly distant parts are functionally extremely close. The FractoGene approach to genome function (by Pellionisz, "fractal genome grows fractal organism", since 2002; see timeline) and The Principle of Recursive Genome Function (2008) explicitly required that the genome be read in a massively parallel fashion (it was a total misunderstanding that, just because the DNA strand is arranged and transcribed linearly, its function would have to be serial as well). Just as in a book, where letters are arranged one by one from cover to cover, the page system of organization permits a jump from the Table of Contents to the middle or the end of the book with basically identical ease (see a more elaborate explanation at "Mr. President, the Genome is Fractal!").]

J Vis Exp. 2010 May 6;(39). pii: 1869. doi: 10.3791/1869.

Hi-C: a method to study the three-dimensional architecture of genomes.

van Berkum NL1, Lieberman-Aiden E, Williams L, Imakaev M, Gnirke A, Mirny LA, Dekker J, Lander ES.

Author information


The three-dimensional folding of chromosomes compartmentalizes the genome and can bring distant functional elements, such as promoters and enhancers, into close spatial proximity (2-6). Deciphering the relationship between chromosome organization and genome activity will aid in understanding genomic processes, like transcription and replication. However, little is known about how chromosomes fold. Microscopy is unable to distinguish large numbers of loci simultaneously or at high resolution. To date, the detection of chromosomal interactions using chromosome conformation capture (3C) and its subsequent adaptations required the choice of a set of target loci, making genome-wide studies impossible (7-10). We developed Hi-C, an extension of 3C that is capable of identifying long range interactions in an unbiased, genome-wide fashion. In Hi-C, cells are fixed with formaldehyde, causing interacting loci to be bound to one another by means of covalent DNA-protein cross-links. When the DNA is subsequently fragmented with a restriction enzyme, these loci remain linked. A biotinylated residue is incorporated as the 5' overhangs are filled in. Next, blunt-end ligation is performed under dilute conditions that favor ligation events between cross-linked DNA fragments. This results in a genome-wide library of ligation products, corresponding to pairs of fragments that were originally in close proximity to each other in the nucleus. Each ligation product is marked with biotin at the site of the junction. The library is sheared, and the junctions are pulled-down with streptavidin beads. The purified junctions can subsequently be analyzed using a high-throughput sequencer, resulting in a catalog of interacting fragments. Direct analysis of the resulting contact matrix reveals numerous features of genomic organization, such as the presence of chromosome territories and the preferential association of small gene-rich chromosomes. 
Correlation analysis can be applied to the contact matrix, demonstrating that the human genome is segregated into two compartments: a less densely packed compartment containing open, accessible, and active chromatin and a more dense compartment containing closed, inaccessible, and inactive chromatin regions. Finally, ensemble analysis of the contact matrix, coupled with theoretical derivations and computational simulations, revealed that at the megabase scale Hi-C reveals features consistent with a fractal globule conformation.
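The compartment analysis described in the abstract can be sketched in a few lines of Python with NumPy. The 6-bin contact matrix below is an invented toy (real Hi-C matrices span thousands of bins and require normalization steps the abstract does not detail), but it shows the core idea: the sign of the leading eigenvector of the correlation matrix partitions bins into two compartments.

```python
import numpy as np

# Invented toy contact matrix for 6 genomic bins: bins {0, 2, 4} and {1, 3, 5}
# contact preferentially within their own group, mimicking the two compartments.
contacts = np.array([
    [9, 1, 8, 1, 7, 1],
    [1, 9, 1, 8, 1, 7],
    [8, 1, 9, 1, 8, 1],
    [1, 8, 1, 9, 1, 8],
    [7, 1, 8, 1, 9, 1],
    [1, 7, 1, 8, 1, 9],
], dtype=float)

corr = np.corrcoef(contacts)       # Pearson correlation between bin contact profiles
_, vecs = np.linalg.eigh(corr)     # eigendecomposition; eigenvalues ascending
pc1 = vecs[:, -1]                  # leading eigenvector of the correlation matrix
compartments = np.sign(pc1)        # its sign partitions the bins into two compartments
print(compartments)                # alternating signs: two interleaved compartments
```

With this toy input, bins 0, 2 and 4 receive one sign and bins 1, 3 and 5 the other, recovering the two groups directly from the contact data.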

Getting Cancer Wrong

By Alexander Nazaryan / March 20, 2014 1:23 PM EDT

[Newsweek cover, March 28, 2014]

[I am sorry, I could not resist inserting at this time-point a symbolic diagram of my fractal-chaotic approach to genome (mis)regulation (2012), priority date 2002 Aug. 1; the reversal of the junk/central dogmas (2008, and the 2008 Google Tech Talk on YouTube); the concept of fractal iterative recursion reaching back to my book chapter of 1989. - Dr. Pellionisz]

From his fourth-floor window at Tampa's Moffitt Cancer Center, Robert A. Gatenby can look down to where patients stand waiting for valets to retrieve their cars. They have gone through chemotherapy, biopsies, radiation. They are pale, anxious, resolute. Some will live and some will die: a young woman with short hair, clutching her partner's hand; an older man, alone. Students from the nearby University of South Florida pop out of patients' cars. Peppy and dressed in blue vests, these cheerful valets look as if they could be working at a luxury hotel in the tropics. But nobody here is on vacation.

Gatenby says he sometimes sees patients retching after chemotherapy, which reminds the 62-year-old radiologist that his Integrated Mathematical Oncology Department—the only full-scale outfit of its kind in the nation—does not have the luxury of time. Mathematics is not generally known for urgency. Few lives hinge on proof of the twin prime conjecture, but the mathematicians and oncologists Gatenby has assembled in Tampa are trying to tame the chaos of cancer in part through the same differential equations that have tortured so many generations of calculus students. By mathematically modeling cancer, they hope to solve it, to make its movements as predictable as those of a hurricane. The patients down there, fresh from treatment, need shelter from the storm.

Gatenby's small corner of Moffitt bears little resemblance to a medical center: There are no white-coated doctors frantically rushing to save patients or synthesizing miracle cures deep into the night. You might think you've found yourself in a sleepy academic department where abstract ideas are kicked around like a soccer ball on the college green. Which, come to think of it, is actually a pretty accurate description of what goes on in Gatenby's lab, though not at all a pejorative one. The mathematicians in his employ are convinced that we do not really understand cancer and that, until we do, our finest efforts will be tantamount to swinging swords in utter darkness. As far as these Tampa iconoclasts are concerned, your average cancer doctor is trying to build a jetliner without having grasped aerodynamics: Say, how many wings should we slap on this thing?

A Malicious Green Cloud

We have been fighting the War on Cancer since 1971, when President Richard M. Nixon declared that the "time has come in America when the same kind of concentrated effort that split the atom and took man to the moon should be turned toward conquering this dread disease." Four decades later, 1,665,540 Americans per year hear the dreaded diagnosis, and about 585,720 die annually from some variety of the disease, according to the American Cancer Society. Smallpox and polio have been cured or largely eradicated, but cancer remains the same scourge it was 4,500 years ago, when the Egyptian doctor Imhotep mused, in what may have been civilization's first stab at oncology, about how to treat "bulging masses on [the] breast." Modern oncology makes incremental advances, with a melanoma drug that extends survival by three months passing for a major breakthrough. This is nobody's fault, but everybody's problem.

Gatenby is tired of a fight we keep losing. After 30 years, he has come to the uneasy conclusion that cancer is smarter than we are, and will find ways to evade our finest medical weaponry. The weary warrior wants to make peace with cancer's insurgent cells—though on his own terms, terms that would spare the lives of many more patients. To some within the medical establishment, this might seem preposterous, but Gatenby relishes the role of the outsider.

Gatenby grew up in the Rust Belt town of Erie, Pa., where 12 years of Catholic school instilled in him "an incredible hatred of dogma." At Princeton University, he studied physics with some of the greatest scientific minds of the 20th century. Figuring he wasn't fated to join the physics pantheon, Gatenby turned to medicine. But medical school at the University of Pennsylvania was dismayingly similar "to the rote learning of catechism" he remembered from Saint Luke School. It felt like he was "going backwards."

Whether in the lab, the classroom or the clinic, Western medicine relies on cautious experimentation, its zeal for breakthroughs tempered by the Hippocratic injunction to do no harm. But that can foster a frustrating incrementalism that is itself injurious. David B. Agus, one of the nation's most prominent oncologists and a professor at the University of Southern California, explains that "you are not rewarded, in general, for taking risk. It's very scary to do something radically new."

Gatenby specialized in radiology and, after receiving his medical degree in 1977 and completing a residency, went to work in 1981 for the Fox Chase Cancer Center in Philadelphia. Fox Chase is to cancer research what the Boston Garden was to professional basketball. It was home to David A. Hungerford, one of two researchers responsible for discovering the Philadelphia Chromosome, a major clue to cancer's birth within the human genome. Among its current éminences grises is Alfred G. Knudson Jr., whose "two-hit" hypothesis holds that cancer is triggered by an unfortunate accumulation of errant genes, harmful outside events (too much sun, too much red meat) or a combination of the two.

The study of genes did not interest Gatenby back then, nor does it interest him now, even though much of medicine is now in the thrall of genomics. Gatenby wanted to discover cancer's "first principles," [Why not consider The Principle of Recursive Genome Function as a reasonable hypothesis? - AJP] the basic ideas behind the seemingly sudden explosion of cells that want to kill the very body that nourishes them. Sure, you could know the BRCA1 gene better than you know your own mother, but unless you had some insight into why it caused a furiously impervious breast cancer, you were trying to find your way out of a forest by studying the bark of a single tree. Gatenby sought to understand cancer with the same totality that Newton had understood gravity.

As with Newton's famous laws of motion, mathematics seemed to hold the key. Math had been used to model the weather and financial markets, which like the human body are fickle and incredibly sensitive to outside forces (a run on Greek banks; a low-pressure system moving down from Canada). Gatenby saw no reason the same could not hold true for cancer. He spent a year reading math, which puzzled his colleagues. Then, while visiting the Cloisters museum in upper Manhattan with his family, he took a sheet of stationery and started scratching down equations he thought could get him closer to cancer's fundamental truths.

"To say they hated it would not do justice," Gatenby says of the response of his Fox Chase colleagues. Other oncologists told him that "math modeling is for people too lazy to do the experiment" and that "cancer is too complicated to model." The latter is a refrain that, 30 years later, still dogs Gatenby and his staff at the Integrated Mathematical Oncology Department, which includes five mathematicians with no formal experience in medicine.

Among those five is Sandy Anderson, a young Scotsman who dresses as if he were on the way to a Beck concert. There is a bottle of single malt on his desk. "Of course cancer is complex," Anderson tells me, brogue rising. "But how can you say it's too complex? That complexity should be viewed as a challenge that we have to try and tackle. And just because there's complexity doesn't mean there aren't simple rules underlying it.

"What we'd love to do is have everybody's own little hurricane model for their cancer," he explains. This is less a metaphor than you may imagine. Anderson shows me computer models of a breast cancer's growth, the cells spreading like a malicious green cloud across the screen. Different versions of the model show what happens when different treatments are applied: Sometimes the cancer slows, but sometimes it explodes. This seems like an intuitively rational approach to the disease, predicting how it responds to a variety of treatments. But it isn't common. There are about a dozen drugs for breast cancer approved by the Food and Drug Administration. Depending on which form of the disease is diagnosed and at what stage it's discovered, there's a maddening number of viable drug combinations. Best practices exist, but these can be anecdotal, doctors simply doing what they think works. The War on Cancer is fought by competing bands with their own weapons, cancer's chaos exacerbated by our own dismaying disorder. Anderson would like to provide the onco-soldiers with battlefield maps.

Wrong But Useful

Weather often came up during my time in Tampa, and not only because a wet dreariness lingered in the Florida skies. In 1961, Edward N. Lorenz of the Massachusetts Institute of Technology tried to create computer models for weather, only to stumble into the field of chaos theory. He saw that weather was entirely dependent on initial conditions, so that if he altered his inputs by even a fraction of a percentage, the weather model would fluctuate to an unexpected degree, in unexpected directions. Yet patterns did emerge. This would come to be called "deterministic chaos," for the way complex adaptive systems—the weather, the global economy, maybe cancer—can both hew to our expectations and routinely subvert them. Sometimes, autumn acts like autumn. But once in a while, in Lorenz's famous formulation, a butterfly causes a hurricane.
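Lorenz's sensitive dependence on initial conditions is easy to demonstrate numerically. The sketch below is my own minimal forward-Euler integration of the classic Lorenz system (not anything from the article): two trajectories start a millionth apart and are driven to macroscopic separation.

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system with the classic parameters."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-6)   # perturb one coordinate by one part in a million
max_gap = 0.0
for _ in range(3000):        # integrate roughly 30 time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_gap = max(max_gap, max(abs(p - q) for p, q in zip(a, b)))
print(max_gap)  # the millionth-sized perturbation grows to the scale of the attractor
```

This is exactly Lorenz's observation: altering the inputs "by even a fraction of a percentage" makes the model fluctuate to an unexpected degree, even though the governing equations are fully deterministic.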

One refrain I heard several times at Moffitt was that "all models are wrong, but some are useful," a quip by the late mathematician George E.P. Box. A mouse injected with melanoma is only an imperfect model of human cancer; if it weren't, you wouldn't be reading this article today, for, as Anderson acidly notes, "We've solved cancer in mice a hundred thousand times." This is a model, too:

[Here there is a simple equation, but as we know, the "maddening complexity" of the Mandelbrot Set is generated by just Z = Z^2 + C. The computers of IBM, Samsung, Sony, BGI, Google, Apple (etc.) do and will only understand algorithms, and simply ignore "anecdotal lamenting on too much complexity" - AJP]
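The point that a trivial recursion can generate "maddening complexity" can be made concrete. The Mandelbrot Set is defined by iterating Z = Z^2 + C from zero and asking whether the orbit stays bounded; the function name and iteration budget below are mine.

```python
def in_mandelbrot(c, max_iter=100, bailout=2.0):
    """Iterate z -> z^2 + c from z = 0; c is in the set if |z| stays bounded."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > bailout:
            return False   # orbit escaped: c lies outside the Mandelbrot Set
    return True

# One-line recursion, arbitrarily intricate boundary:
print(in_mandelbrot(0 + 0j))    # True: the orbit stays at zero
print(in_mandelbrot(1 + 0j))    # False: 0, 1, 2, 5, 26, ... diverges
print(in_mandelbrot(-1 + 0j))   # True: the orbit oscillates between -1 and 0
```

Complexity of output, in other words, is no argument against simplicity of the generating rule.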

If that freaks you out, don't worry—it freaks out a lot of clinicians. Gatenby and his team are doing the math for them, convinced that their models of cancer strike the right balance between specificity and universality.

The other option is to keep chasing errant genes and trying to snuff them out, but that seems to many like a futile enterprise, sort of like trying to plug a leaking dam with wads of cotton. A tumor that weighs just 10 grams, Gatenby says, contains more cells than there are humans on earth. Nor are those cells a uniform gray mass, as the popular conception of cancer has it. As the tumor grows, different mutations may come to the fore, sort of the way a military assault may deploy infantry and artillery at different times in an attack. The cells of a single cancer differ within a single patient, and the same types of cancers differ from patient to patient. Talking about a prototypical cancer, then, is about as helpful as talking about a prototypical dog.
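Gatenby's cell count is easy to sanity-check, assuming the common order-of-magnitude estimate of about 10^9 cells per gram of tumor tissue (an assumption of mine; estimates vary by tissue type).

```python
CELLS_PER_GRAM = 1e9        # rule-of-thumb estimate; varies by tissue type
WORLD_POPULATION = 7.2e9    # approximate figure around the time of the article

tumor_cells = 10 * CELLS_PER_GRAM      # a 10-gram tumor: about 1e10 cells
print(tumor_cells > WORLD_POPULATION)  # True
```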

"It's almost like it's an intelligent opponent," says Donald A. Berry, who heads the biostatistics department at the M.D. Anderson Cancer Center in Houston. "It has many, many paths that it can take." Mathematics, Berry says, can "provide answers where biology runs into a wall."

One of those walls is the sheer amount of information cancer researchers would need to map every possible genetic mutation possible for the 200 cancers that can ravage the human body. Researchers have spent $375 million to create the Cancer Genome Atlas, which is based on the screening of 10,000 cancer samples for the responsible genes. Some think that until we've sorted through about 100,000 samples, the cancer gene compendium will be woefully incomplete. "It would be crazy not to have the information," the geneticist Eric S. Lander told The New York Times.

But information brings its own delusions. Gatenby laments the "vast industry that's developed over molecular data." He is frustrated by the narrow focus of many of his colleagues. The bookshelves in his office don't hold the standard medical tomes; they are instead lined with rare physics texts from the early 20th century, including several volumes of the Annalen der Physik, which published the pioneering work of Einstein, Hertz and Planck. Nestled among these is a copy of Everyone Poops—Gatenby recently became a grandfather.

The most curious (and most telling) book on Gatenby's shelf is The Truth in Small Doses: Why We're Losing the War on Cancer--and How to Win It. Having survived Hodgkin's lymphoma as a young man, Clifton Leaf decided to investigate why cancer medicine had made so few advances in recent years. The resulting Fortune article in 2004, as well as his book last year, offers little cause for optimism, with their depiction of a medical culture whose caution has gradually ossified into maddening inertia.

"I like big thinkers," Leaf told me when I asked him about Gatenby's work. "He's a guy who doesn't get stuck in orthodoxy." As Leaf writes in his book, the whack-a-gene approach to cancer has its finest success story in Gleevec, which, since its introduction about a decade ago, has proved an adept warrior against chronic myelogenous leukemia. It does exactly what proponents of the genetic approach to cancer hope, seeking out the tyrosine kinase enzymes that drive the growth of the once-deadly blood cancer. Time magazine put Gleevec on its cover in 2001, asking, "This little pill targets cancer cells with uncanny precision. Is it the breakthrough we've been waiting for?"

Unlike most solid-tumor cancers, the type of leukemia Gleevec cured is spurred by a single "driver" mutation. Render it inert, as Gleevec does, and the cancer is largely defeated. But few other iterations of the disease are so simple. The closest successor drug is Herceptin, which targets breast cancer tumors with the HER2-positive genetic alteration. Otherwise, targeted therapy has not fulfilled its promise; chasing after errant genes through the body is like trying to catch a school of tuna with a Ziploc bag.

And so Gatenby and his team have taken the opposite tack, going way big and trying to understand all of cancer, instead of just one or two genes. Gatenby told me a story about a cancer conference in Toronto where all attendees were introduced by the name and the molecule they were studying. He chuckles at this myopia, divorced from any greater vision of the Brobdingnagian disease.

'A Blind, Emotionless Alien'

Gatenby knows that all his equations will mean nothing if they don't help patients. Ultimately, he will have to convince the very clinicians who frustrate him that his abstractions can have real-world benefits. He needs to not only predict the hurricane but also save the cities in its path.

In 2000, Gatenby went to the University of Arizona and was named the head of radiology at its College of Medicine in 2005. It was here in the desert of Tucson that he had an intellectual conversion. He had been publishing mathematical models of cancer during the past two decades at Fox Chase, but now he began to understand the role that evolution plays in carcinogenesis. The first principles of cancer that he had been trying to find, Gatenby surmised, lay in the Darwinian concept of natural selection.

Gatenby's insight was brilliantly counterintuitive: Cancer is really, really good at evolution. So damn good that our bodies nourish it, even as it hijacks blood vessels and nutrients. It fools the immune system, nestling so deep within normal tissue that we can't easily extract it. And then, in what amounts to suicide, it kills the very body in which it has taken root. The writer Christopher Hitchens once described the esophageal cancer that would soon kill him as a "blind, emotionless alien." But that alien is actually a native son.

Math could provide a map of cancer's movements; Gatenby now understood that only Darwin could explain why that movement was so hard to arrest. We were unwittingly helping that evolution along, turning all too many cancers into hurricanes. Worst of all, we were doing it in the name of saving lives.

Gatenby was convinced by studying pest management, of all disciplines. By the early 1970s, the agricultural industry had come to realize the limit of synthetic pesticides, which had been famously demonized by Rachel Carson's Silent Spring in 1962. They were not only potentially harmful to humans but possibly not all that good at protecting crops. If used indiscriminately, chemical agents would indeed kill plenty of insects, but those that survived had a resistance to the toxic substance, which could do nothing against the remaining pests, which were free to breed. This was the evolutionary version of the cliché about how the thing that doesn't kill you makes you stronger. Our efforts to vanquish the bugs forced evolution's hand.

A little more than a month after signing the National Cancer Act on December 23, 1971, Nixon addressed Congress on the nation's environmental challenges. Among the initiatives he introduced was Integrated Pest Management, which promised "judicious use of selective chemical pesticides in combination with nonchemical agents and methods," like the deployment of natural predators. Instead of trying to kill all bugs, Integrated Pest Management would try to control the population, less concerned with annihilation than watchful containment. There would always be some bugs; the goal was to keep them from spreading, often by using less than the maximum dosage of pesticide.

The War on Cancer sold the American public (and much of the medical establishment) on the idea that cancers must be vanquished entirely, that no stalemate was possible. Thus the endless rounds of chemo a patient faces today, killing good cells with the bad. Gatenby thought that Integrated Pest Management offered a rejoinder to that all-or-nothing mentality. The bug guys had realized what the cancer guys hadn't: You raze an entire enemy city, and those who remain will be hardened insurgents with a lust to kill. But trim away strategically at the enemy's forces, and the rest of the population will be kept at bay. Treatment, in other words, may aggravate a cancer's growth by stripping away the easy-to-kill cells and leaving behind hardened carcinogenic warriors.

"Evolution will win this game," Gatenby tells me. Malignant cells will eventually evolve beyond the capacity of any drug to hold them at bay. Even the patients of wonder drug Gleevec face an eventual recrudescence of chronic myelogenous leukemia. The question is whether a mathematical understanding of how cancer progresses can lead to treatments that are less prone to aggravating resistant cells into proliferation.

Gatenby came to Moffitt in 2008 to fix the radiology department. Though he continues to do clinical rounds one day each week, his intellectual energies have clearly shifted to the Integrated Mathematical Oncology Department, which has six full-time members and about a dozen postdoctoral and graduate students. They know that many think they are on a quixotic quest. They know that the money is in drug development. They know that too many people are dying. And yet there they are, jovially scrawling equations on a blackboard.

Ravaged by Rambo

So how do you defeat a disease that is constantly evolving a resistance to our weapons?

Maybe by turning your swords into plowshares. In 2009, Gatenby published a paper in Cancer Research called "Adaptive Therapy." Gatenby posited that a tumor consisted, in essence, of chemo-sensitive cells, fast to proliferate, and chemo-resistant cells, more reluctant to grow. The standard practice of giving the maximum tolerated dose of chemo would clear out the sensitive cells, leaving behind a tough nugget of impervious cells, the al-Qaida rebels of the bunch. These previously dormant cells would now pour out of their caves, suddenly finding both space and nourishment to grow. And grow they would, with the barrier of the sensitive cells gone. In essence, cancer therapy was killing "good" cancer cells while leaving behind "bad" ones.
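The sensitive/resistant dynamics described above can be caricatured with a toy two-population model. This is an illustration of the logic only, not Gatenby's published equations; all parameter values are invented.

```python
def simulate(dose_policy, steps=150, dt=0.1):
    """Toy two-population tumor model (illustrative only; invented parameters).

    S: chemo-sensitive cells (fast growth), R: resistant cells (growth cost).
    Both compete for a shared carrying capacity K; the drug kills only S.
    """
    S, R = 0.5, 0.01                       # initial burden (arbitrary units)
    K, rS, rR, kill = 1.0, 1.0, 0.5, 2.0   # capacity, growth rates, drug kill rate
    for _ in range(steps):
        total = S + R
        dose = dose_policy(total)
        S += dt * S * (rS * (1 - total / K) - kill * dose)
        R += dt * R * rR * (1 - total / K)
    return S, R

mtd = lambda total: 1.0                               # maximum tolerated dose: always on
adaptive = lambda total: 1.0 if total > 0.6 else 0.0  # treat only above a burden threshold

S1, R1 = simulate(mtd)
S2, R2 = simulate(adaptive)
print("MTD:      resistant fraction %.2f" % (R1 / (S1 + R1)))
print("Adaptive: resistant fraction %.2f" % (R2 / (S2 + R2)))
```

In this caricature, the always-on regimen wipes out the sensitive cells and hands the carrying capacity to the resistant clone, while the threshold policy deliberately keeps sensitive cells around to compete with, and suppress, the resistant ones.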

Gatenby used both mathematical models and in vivo experiments to show that adaptive therapy, which kept some of cancer's petty criminals around, actually held the most dangerous cells in abeyance. One of his co-authors on the 2009 paper was Ariosto S. Silva, a mathematician now at Moffitt. Silva, who is Brazilian, likes to explain cancer through the Rambo film First Blood, about a bloodthirsty Sylvester Stallone protagonist who is made an especially vicious killer because of the suffering he endured in Vietnam. Our approach to chemotherapy is turning cancer cells into Rambos, Silva explained when I met him in Tampa.

A recent innovation is the use of ersatzdroges, or "fake drugs," which target chemo-resistant cells without killing them. In a draft paper titled "Sweat but No Gain," Gatenby and his Moffitt colleagues describe how multi-drug-resistant cells "continue to activate their membrane pumps to extrude the ersatzdroge as though it were a cytotoxic agent," that is, an actual chemotherapeutic drug. But it isn't. The cells don't know that, however, working furiously to defend themselves, "thus causing a decrease in fitness by limiting available resources for proliferation and invasion." While research into ersatzdroges is relatively new, Gatenby's paper suggests that tiring cells out instead of killing them does slow tumor growth. Instead of evolving with their resistance, the cells expend all their energy staying afloat.
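The competition logic behind adaptive therapy can be illustrated with a toy two-population model: chemo-sensitive cells outgrow resistant ones when space is scarce, so leaving some sensitive cells alive suppresses the resistant clone. This is only a sketch; the growth rates, kill rate, 0.5 burden threshold, and the `simulate` function are all invented for illustration and are not Gatenby's published model.

```python
# Toy model contrasting maximum-tolerated-dose (MTD) chemotherapy with
# adaptive therapy. Sensitive cells (S) grow faster than resistant
# cells (R); both compete for a shared carrying capacity K. The drug
# kills only sensitive cells.
def simulate(drug_schedule, steps=300, dt=0.1):
    S, R = 0.5, 0.01            # initial tumor fractions
    K = 1.0                     # shared carrying capacity
    rS, rR = 0.30, 0.15         # growth rates: sensitive > resistant
    kill = 0.8                  # drug kill rate (sensitive cells only)
    for t in range(steps):
        drug = drug_schedule(t, S + R)
        crowd = 1 - (S + R) / K  # competition for space and nutrients
        S += dt * (rS * S * crowd - kill * drug * S)
        R += dt * (rR * R * crowd)
        S, R = max(S, 0.0), max(R, 0.0)
    return S, R

mtd = lambda t, burden: 1.0  # always dose at maximum
adaptive = lambda t, burden: 1.0 if burden > 0.5 else 0.0  # dose only when tumor is large

S1, R1 = simulate(mtd)       # MTD wipes out S; R expands unopposed
S2, R2 = simulate(adaptive)  # sensitive cells kept around crowd out R
print(f"MTD:      resistant fraction = {R1:.3f}")
print(f"Adaptive: resistant fraction = {R2:.3f}")
```

In this sketch the MTD schedule eliminates the sensitive population and leaves the resistant clone to expand, while the adaptive schedule ends with a much smaller resistant fraction, which is the qualitative point of the 2009 paper.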

Some oncologists are skeptical of Gatenby's approach. Robert Weinberg, the MIT cancer researcher who discovered the first cancer-causing gene, does not believe mathematical oncology is a fruitful pathway because it "lacks predictive powers that extend beyond predictions made from simple, intuitive assessments of future behavior," as he told me in an email. Others say that while Gatenby's evolutionary portrait of cancer is a clever analogy, it is not instructive for treatment of the disease. Marc B. Garnick, a urologic cancer specialist at the Beth Israel Deaconess Medical Center and Harvard Medical School, says Gatenby's ideas mirror the practice of metronomic therapy, which "has been around for decades." (Gatenby disputes this claim.)

But for clinicians frustrated with the current pace of progress in the War on Cancer, Gatenby at the very least offers a new way of thinking about a disease that has perplexed humanity for thousands of years.

Athena Aktipis, an evolutionary biologist and co-founder of the Center for Evolution and Cancer at the University of California, San Francisco (and a sometime colleague of Gatenby's), likes to think of the different approaches to the War on Cancer in terms of Greek mythology. "My name is Athena, I know," she says by way of disclaimer before launching into an explanation: The god of war, Ares, understandably "likes a good fight." Athena, meanwhile, is a master strategist, always scheming about how to outwit the enemy. Her namesake would know better, Aktipis implies, than trying to "intimidate the cancer into retreating."

In Tampa, Gatenby and I went to dinner at a Cajun restaurant with Anderson and Silva, two of Moffitt's mathematicians. Downing his first drink, Anderson indicated his distaste for most of what passes for oncology today: "We keep measuring s**t without getting anything out of it." Everyone laughed. Gatenby complained about how the first fancy car he'd ever purchased had had its cool compromised by the installation of a car seat. Everyone laughed again. It was like the scene in The Untouchables when Eliot Ness, the famed Prohibition enforcer played by Kevin Costner, takes his lawmen out for a celebratory dinner after a huge cache of booze has been found and summarily destroyed. They are drunk with elation, but also a little anxious: The main target, Al Capone, remains at large. So it was in Chicago. So it was in Tampa, only with drinks.

"I've never seen a dogma I didn't hate," Gatenby told me. That was true at the Catholic school in Pennsylvania. And it is true today, at the hospital in Florida, where the cancer patients are waiting.

[Finally, we have a leading weekly (Newsweek) having made quite a turn-around! Of course, quite soon many will say (in retrospect...) that "we have never believed in the Central and Junk Dogmas". It was "understandable" that prior to the 2007 conclusion of ENCODE, when the US government came out with a refutation of the dogma of Junk, our manuscript to Science, authored by 25 experts worldwide, went "unpublished without review" (2006). It was less understandable that the Old Newsweek in 2007 totally censored out a brilliant article by Princeton's Prof. Lee Silver from its USA/Canada edition (it appeared in the European, South American and Asian editions). Now, just a fortnight after Newsweek's eye-opener article (Hacking Your DNA, see in this column), the USA appears to wake up to the biggest paradigm-shift in science/technology/health-care/defense (of HoloGenomics) - ever! Not entirely surprisingly, the US was also a bit tardy with the paradigm-shift of nuclear science/technology, until the Hungarian Dr. Szilard (driven by Teller to Einstein) alerted the establishment to get going "American Industry Style". As the two articles below show, the Asian world-powers, both China and India (the latter through an award-winning lecture-tour by AJP), along with Samsung of Korea and Sony of Japan, are actually ahead at this time - though the US expresses its interest by dishing out encouraging prizes; most recently the Hertz Fellowship to Erez Lieberman-Aiden, for his pioneering of the fractal genome in 2009. - Dr. Pellionisz]

Fractal geometry to help diagnose cancer

The Times of India
TNN | Feb 20, 2014, 03.12AM IST

NAGPUR: Human organs cannot always be represented through geometrical shapes. That is why several scientists have felt the need to widen the scope of science to include non-linear mathematics. A team of oral pathologists from the city's VSPM Dental College and Research Centre has come up with a solution in a study on utilizing fractal geometry, which has the ability to quantify irregular and complex objects by finding symmetry in them.

Though the study is still in its initial phase, it has already been presented at King George's Medical University, Lucknow, where it bagged first prize in the poster competition at the International Oral Pre-Cancer and Cancer Congress 2014. If utilized properly, fractal geometry can be a helpful diagnostic tool: by establishing a standard or normal range of fractal dimension, this measure can be used for the diagnosis of oral cancer. Fractals are objects formed from sub-parts that resemble the whole object exactly or statistically.
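The article does not say how a fractal dimension would actually be measured; the standard estimator is box counting: cover the object with grids of shrinking box size s, count the occupied boxes N(s), and fit the slope of log N(s) against log(1/s). A minimal sketch, assuming the lesion outline is available as a set of (x, y) points in the unit square; the `sierpinski` test object and all function names here are illustrative, not from the study.

```python
import math

def box_count_dimension(points, sizes):
    """Estimate fractal dimension of a 2-D point set by box counting:
    count occupied boxes N(s) at each box size s, then take the
    least-squares slope of log N(s) versus log (1/s)."""
    logs = []
    for s in sizes:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

def sierpinski(depth):
    """Synthetic test object: a right-angle Sierpinski gasket, whose
    dimension is known analytically (log 3 / log 2, about 1.585)."""
    pts = [(0.0, 0.0)]
    for _ in range(depth):
        pts = [(x / 2 + dx, y / 2 + dy)
               for x, y in pts
               for dx, dy in ((0, 0), (0.5, 0), (0, 0.5))]
    return pts

d = box_count_dimension(sierpinski(7), sizes=[1/4, 1/8, 1/16, 1/32])
print(f"estimated dimension: {d:.3f}")  # analytic value: log 3 / log 2 ~ 1.585
```

Diagnosis along the lines the article suggests would then amount to comparing the estimated dimension of a patient's lesion against an established normal range.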

[Pellionisz received an Award from India for his Fractal Approach to Genomics, presented in a Guest of Honor Lecture Tour - 2012]

Designing Fractal Nanostructured Biointerfaces for Biomedical Applications

[See full .pdf of Chinese Fractals, 2014]

Pengchao Zhang[a, b] and Shutao Wang*[a]

[a] Dr. P. Zhang, Prof. S. Wang, Beijing National Laboratory for Molecular Sciences (BNLMS), Key Laboratory of Organic Solids, Institute of Chemistry, Chinese Academy of Sciences (ICCAS), Beijing, 100190 (P.R. China). Fax: (+86) 010-82627566

[b] Dr. P. Zhang, University of Chinese Academy of Sciences, Beijing 100049 (P.R. China)

Conclusions and Outlooks

Fractal structures can provide a unique "fractal contact mode" that ensures an outward, 3D, and spatial contact for biomarkers, a reduced diffusion length for small biomarkers to the surface of the sensor, and enhanced topographic interaction by matching the fractal dimensions of the tumor cells, which makes biomarkers easily accessible to probes and consequently promotes detection efficiency and sensitivity. Fractal nanostructured biointerfaces have been proven to be effective for the ultrasensitive detection of various biomarkers with extremely low abundances in clinical samples, and this paves the way for the further exploitation of these nanostructures in biomedical applications. Moreover, the successful detection of disease-relevant biomarkers, such as miRNA, CA-125, and MCF7 cells, from unpurified cell lysates and from the blood of patients makes these nanostructures promising for practical clinical use.


We thank the National Research Fund for Fundamental Key Projects (2012CB933800), the National Natural Science Foundation (21121001, 21127025, 21175140, and 20974113), the Key Research Program of the Chinese Academy of Sciences (KJZD-EW-M01), the National High Technology Research and Development Program of China (863 Program) (2013AA031903), and the National Instrumentation Program (NIP) (2013YQ190467) for financial support.

The New York Genome Center and IBM Watson Group Announce Collaboration to Advance Genomic Medicine

IBM Selected as First Technology Partner for Leading Genomic Research Institution; Project aims to Apply Advanced Analytics to Genomic Treatment Options for Brain Cancer Patients

IBM-NYC GENOME CENTER "HoloGenome" effort (0:30 sec)

NEW YORK, N.Y. - 19 Mar 2014: The New York Genome Center (NYGC) and IBM (NYSE: IBM) today announced an initiative to accelerate a new era of genomic medicine with the use of IBM’s Watson cognitive system. IBM and NYGC will test a unique Watson prototype designed specifically for genomic research as a tool to help oncologists deliver more personalized care to cancer patients.

NYGC and its medical partner institutions plan to initially evaluate Watson’s ability to help oncologists develop more personalized care to patients with glioblastoma, an aggressive and malignant brain cancer that kills more than 13,000 people in the U.S. each year. Despite groundbreaking discoveries into the genetic drivers of cancers like glioblastoma, few patients benefit from personalized treatment that is tailored to their individual cancer mutations. Clinicians lack the tools and time required to bring DNA-based treatment options to their patients and to do so, they must correlate data from genome sequencing to reams of medical journals, new studies and clinical records -- at a time when medical information is doubling every five years.

This joint NYGC Watson initiative aims to speed up this complex process, identifying patterns in genome sequencing and medical data to unlock insights that will help clinicians bring the promise of genomic medicine to their patients. The combination of NYGC’s genomic and clinical expertise coupled with the power of IBM’s Watson system will enable further development and refinement of the Watson tool with the shared goal of helping medical professionals develop personalized cancer care.

The new cloud-based Watson system will be designed to analyze genetic data along with comprehensive biomedical literature and drug databases. Watson can continually ‘learn’ as it encounters new patient scenarios, and as more information becomes available through new medical research, journal articles and clinical studies. Given the depth and speed of Watson’s ability to review massive databases, the goal of the collaboration is to increase the number of patients who have access to care options tailored to their disease’s DNA.

“Since the human genome was first mapped more than a decade ago, we’ve made tremendous progress in understanding the genetic drivers of disease. The real challenge before us is how to make sense of massive quantities of genetic data and translate that information into better treatments for patients,” said Robert Darnell, M.D., Ph.D., CEO, President and Scientific Director of the New York Genome Center. “Applying the cognitive computing power of Watson is going to revolutionize genomics and accelerate the opportunity to improve outcomes for patients with deadly diseases by providing personalized treatment.”

First Watson Application in Genomic Research

Watson will complement rapid genome sequencing and is expected to dramatically reduce the time it takes to correlate an individual’s genetic mutations with reams of medical literature, study findings, and therapeutic indications that may be relevant. The intention is to provide comprehensive information to enable clinicians to consider a variety of treatment options that the clinician can tailor to their patient’s genetic mutations. It will also help NYGC scientists understand the data detailing gene sequence variations between normal and cancerous biopsies of brain tumors.

“As genomic research progresses and information becomes more available, we aim to make the process of analysis much more practical and accessible through cloud-based, cognitive innovations like Watson,” said Dr. John E. Kelly III, Senior Vice President and Director of IBM Research. “With this knowledge, doctors will be able to attack cancer and other devastating diseases with treatments that are tailored to the patient’s and disease’s own DNA profiles. If successful, this will be a major transformation that will help improve the lives of millions of patients around the world.”

The goal is to have the Watson genomics prototype assist clinicians in providing personalized genomic analytics information as part of a NYGC clinical research study. The solution has been under development for the past decade in IBM’s Computational Biology Center at IBM Research.

New York State’s Investment in Genomic Medicine

New York State is at the forefront of advancing medical science and commercialization. Governor Andrew M. Cuomo recently proposed $105 million to fund a partnership between NYGC and the University at Buffalo’s Center for Computational Research to advance genomics research. This investment to enhance the state’s genomic medicine capabilities, together with NYGC’s acquisition of Illumina’s state-of-the-art HiSeq X Ten whole human genome sequencing system, will accelerate the availability of valuable genomic information in New York.

“New York State’s investment in cutting-edge innovative industries is creating jobs and growing the economy in Western New York and across our state,” said Governor Cuomo. “This collaboration between the New York Genome Center and IBM will help make the region a new hub for the growing bio-tech industry.”

IBM is NYGC’s Founding Technology Member and will advance the organization’s goals of translating genomic research into clinical solutions for serious disease through the collaboration of medicine, science and technology. As biology increasingly becomes an information science, the promise of genomics is closer to reality with the help of data-driven analytics methods and more powerful computing systems. IBM and NYGC’s computational biology experts are renowned for accelerating life sciences discoveries using deep analytical approaches and next generation information technologies.


About the New York Genome Center

The New York Genome Center (NYGC) is an independent, nonprofit at the forefront of transforming biomedical research and clinical care with the mission of saving lives. As a consortium of renowned academic, medical and industry leaders across the globe, NYGC focuses on translating genomic research into clinical solutions for serious disease. Our member organizations and partners are united in this unprecedented collaboration of technology, science, and medicine. We harness the power of innovation and discoveries to improve people’s lives - ethically, equitably, and urgently. Member institutions include: Albert Einstein College of Medicine, American Museum of Natural History, Cold Spring Harbor Laboratory, Columbia University, Cornell University/Weill Cornell Medical College, Hospital for Special Surgery, The Jackson Laboratory, Memorial Sloan-Kettering Cancer Center, Icahn School of Medicine at Mount Sinai, New York-Presbyterian Hospital, The New York Stem Cell Foundation, New York University, North Shore-LIJ, The Rockefeller University, Roswell Park Cancer Institute and Stony Brook University. For more information, visit:

About IBM Watson

Named after IBM founder Thomas J. Watson, Watson was developed in IBM’s Research labs and is now being accelerated into market by the new Watson Group. Watson represents a new class of software, services and apps that think, improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data. Watson’s ability to answer complex questions posed in natural language with speed, accuracy and confidence is transforming decision-making across a variety of industries, including health care, financial services and retail. IBM has advanced Watson from a game-playing innovation into a commercial technology. Using natural language processing and analytics, Watson processes information akin to how people think, representing a major shift in an organization’s ability to quickly analyze, understand and respond to Big Data. Now delivered from the cloud and able to power new consumer and enterprise services and apps, Watson is 24 times faster, smarter with a 2,400 percent improvement in performance, and 90 percent smaller – IBM has shrunk Watson from the size of a master bedroom to three stacked pizza boxes. IBM is investing $1 billion to introduce a new class of cognitive computing services, software and apps, and investing $100 million to spur innovation for software application providers to develop a new generation of Watson-powered solutions.

[IBM just about completes the list of Samsung, Sony/Olympus/Illumina, Google, Google & Apple (Calico, Inc.), BGI/Complete Genomics and other IT Giants - Dr. Pellionisz]

A novel mechanism for fast regulation of gene expression


March 18, 2014

Our genome, we are taught, operates by sending instructions for the manufacture of proteins from DNA in the nucleus of the cell to the protein-synthesizing machinery in the cytoplasm. These instructions are conveyed by a type of molecule called messenger RNA (mRNA).

Francis Crick, co-discoverer of the structure of the DNA molecule, called the one-way flow of information from DNA to mRNA to protein the "central dogma of molecular biology."

Yehuda Ben-Shahar and his team at Washington University in St. Louis have discovered that some mRNAs have a side job unrelated to making the protein they encode. They act as regulatory molecules as well, preventing other genes from making protein by marking their mRNA molecules for destruction.

"Our findings show that mRNAs, which are typically thought to act solely as the template for protein translation, can also serve as regulatory RNAs, independent of their protein-coding capacity," Ben-Shahar said. "They're not just messengers but also actors in their own right." The finding was published in the March 18 issue of the new open-access journal eLife.

Although Ben-Shahar's team, which included neuroscience graduate student Xingguo Zheng and collaborators Aaron DiAntonio and his graduate student Vera Valakh, was studying heat stress in fruit flies when they made this discovery, he suspects this regulatory mechanism is more general than that.

Many other mRNAs, including ones important to human health, will be found to be regulating the levels of proteins other than the ones they encode. Understanding mRNA regulation may provide new purchase on health problems that haven't yielded to approaches based on Crick's central dogma.

Is gene expression regulated directly?

Ben-Shahar's original objective was to better understand how organisms maintain their physiological balance when they are buffeted by changes in the environment.

Neuroscientists know that if you warm neurons in culture, he said, the neurons will fire more rapidly. And if the culture is cooled down, the neurons slow down. Neurons in an organism, however, behave differently from those in a dish. Usually the organism is able to cushion its nervous system from heat stress, at least within limits. But nobody knew how they did this.

As a fruit fly scientist, Ben-Shahar was aware that there are mutations in fruit flies that make them bad at buffering heat stress, and this provided a starting point for his research.

One of these genes is actually called seizure, because flies with a broken copy of this gene are particularly sensitive to heat. Raising the temperature even 10 degrees sends them into seizures. "They seize very fast, in seconds," Ben-Shahar said.

"When we looked at seizure (sei) we noticed that there is another gene on the opposite strand of the double-stranded DNA molecule called pickpocket 29 (ppk29)," Ben-Shahar said. This was interesting because seizure codes for a protein "gate" that lets potassium ions out of the neuron and pickpocket 29 codes for a gate that lets sodium ions into the neuron.

Neurons are "excitable" cells, he said, because they tightly control the gradients of potassium and sodium across their cell membranes. Rapid changes in these gradients cause a nerve to "fire," to stop firing, and to repolarize, so that it can fire again.

The scientists soon showed that transcription of these genes is coordinated. When the flies are too hot, they make more transcripts of the sei gene and fewer of ppk29. And when the flies cooled down, the opposite happened. If the central dogma held in this case, the neurons might be buffering the effects of heat by altering the expression of these genes.

One problem with this idea, though, is that gene transcription is slow and the flies, remember, seize in seconds. Was this mechanism fast enough to keep up with sudden changes in the environment?

Does RNA interference regulate gene expression?

But the scientists had also noticed that the two genes overlapped a bit at their tips. The tips, called the 3' UTRs (untranslated regions), don't code for protein but are transcribed into mRNA.

That got them thinking. When the two genes were transcribed into mRNA, the two ends would complement one another like the hooks and loops of a Velcro fastener. Like the hooks and loops, they would want to stick together, forming a short section of double-stranded mRNA. And double-stranded mRNA, they knew, activates biochemical machinery that degrades any mRNA molecules with the same genetic sequence.

Double-stranded RNA binds to a protein complex called Dicer that cuts it into fragments. Another protein complex, RISC, binds these fragments. One of the RNA strands is eliminated but the other remains bound to the RISC complex. RISC and its attached RNA then becomes a heat-seeking missile that finds and destroys other copies of the same RNA. If the mRNA molecules disappear, their corresponding proteins are never made.
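The geometry described above, with two genes on opposite DNA strands overlapping at their 3' UTRs, implies that the two mRNAs carry reverse-complementary tails that can hybridize into the double-stranded RNA Dicer recognizes. A minimal sketch of checking for such a duplex-forming overlap; the sequences and function names below are invented for illustration and are not the real sei or ppk29 sequences.

```python
# RNA complement table: A pairs with U, G pairs with C.
COMP = str.maketrans("AUGC", "UACG")

def reverse_complement(rna):
    """Reverse complement of an RNA sequence (A-U, G-C pairing)."""
    return rna.translate(COMP)[::-1]

def duplex_overlap(mrna_a, mrna_b, min_len=8):
    """Return the longest 3' tail of mrna_a whose reverse complement
    appears in mrna_b -- a candidate double-stranded region."""
    for length in range(len(mrna_a), min_len - 1, -1):
        tail = mrna_a[-length:]
        if reverse_complement(tail) in mrna_b:
            return tail
    return None

# Hypothetical transcripts sharing a 12-nt overlap at their 3' UTRs:
# the ppk29 mRNA embeds the reverse complement of the sei mRNA's tail.
sei_utr_tail = "AUGGCCUUAGCA"
sei_mrna = "GGGAAACCC" + sei_utr_tail
ppk29_mrna = "CCAUU" + reverse_complement(sei_utr_tail) + "AAGG"

overlap = duplex_overlap(sei_mrna, ppk29_mrna)
print("duplex-forming tail:", overlap)
```

In the mechanism the scientists describe, such a duplex region is what would recruit Dicer and, via RISC, target the remaining sei transcripts for destruction.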

It turned out that heat sensitivity in the fly is all about the potassium channel, said Ben-Shahar. What if, he thought, when the two mRNAs stuck together, the mRNA segment encoding the potassium channel was bound to RISC and other copies of the potassium channel mRNA were destroyed? This was another, potentially faster way the neurons might be controlling the excitability of their membranes.

A designer fly provides an answer

Which is it? Is regulation occurring at the gene level or the mRNA level?

To find out, the scientists made designer fruit flies that had various combinations of the genes and their sticky noncoding ends. One of these transgenic fly lines was missing the part of the gene coding for the ppk29 protein but still made lots of mRNA copies of the sticky bit at the end of ppk29. When there were lots of these isolated sticky bits, sei mRNA levels dropped. This fly was as heat sensitive as a fly completely missing the sei gene.

This combination of genotype and phenotype held the answer to the regulatory problem. First of all, mRNA from one gene (ppk29) is regulating the mRNA of another gene (sei). And, second, the regulatory part of ppk29 is the untranslated bit at the end of the mRNA. When this bit sticks to a complete transcript of the sei gene (including, of course, its sticky bit), the RISC machinery destroys any copies of the sei mRNA it finds.

So the gene that codes for a sodium channel regulates the expression of the potassium channel gene. And it does so after the genes are transcribed into mRNA; it's mRNA-dependent regulation.

The interaction between sei and ppk29 is unlikely to be unique, Ben-Shahar said. The potassium channel is highly conserved among species, and analyses of the genome sequences in flies and in people show that two of three fly genes for this type of potassium channel and three of eight human genes for these channels have overlapping 3' UTR ends, just as do sei and ppk29.

Why does this regulatory mechanism exist? Ben-Shahar hates getting out in front of his data, but he points out that transcribing DNA into mRNA is a slower process than translating mRNA into protein. So it may be, he said, that neurons maintain a pool of mRNAs in readiness, and mRNA interference is a way to quickly knock down that pool to prevent the extra mRNA from being translated into proteins that might get the organism in trouble.

["Facts do not kill theories - only a more advanced theory can take the place of an obsolete one", said Einstein. Crick's rather ridiculous "Central Dogma" (so named since, as he confessed, he did not know what the word "Dogma" meant :-) has died of a million wounds of facts. Indeed, John Mattick has long labeled the Junk/Dogma interlocking fatal errors "the biggest mistake in the history of molecular biology". The question remains, therefore: what do we have, as mathematically well-formulated theories, that can (indeed, must) take the place of the obsolete "old school" of DNA>RNA>PROTEIN (with information "never" recursing :-)? Since the science paper apparently is yet to come, perhaps the authors will look into the existing literature. As detailed in The Principle of Recursive Genome Function (2008), a peer-reviewed paper, and in its popularization, the 2008 Google Tech Talk YouTube, by reversing both mistaken axioms those companies that are set to deploy software-enabling algorithmic approaches to the interpretation of genome function (most importantly, hologenome-regulation) will arrive at the conclusion that recursive algorithms, most particularly Pellionisz' fractal iterative recursion (FractoGene), are a leading candidate today. The "best methods" were revealed in the above publications. More recent "best methods" are indicated by interpretation of the RNA-system as a dual functor, one that both "makes" proteins (by DNA of exons, through cRNA and Proteins) and "measures" already built proteins (by ncDNA transcription factors emanating ncRNA). Mathematically, this dual representation is the key to understanding, and thus utilizing, hologenome regulation. - Dr. Pellionisz]

S. Korea announces $540 million post-genome project

2014/02/19 12:00

SEJONG, Feb. 19 (Yonhap) -- The South Korean government announced Wednesday the official launch of a post-genome project that seeks to develop and commercialize new genomic technologies.

The project consists of five major goals that include the creation of a standardized human genome map. It also aims to develop indigenous technologies for analyzing human genes and for genome-based diagnosis and disease treatment.

The health and welfare ministry, the Rural Development Agency and four other government offices will spend some 578 billion won (US$542 million) over the next eight years, starting with 45.5 billion won in 2014 alone.

The latest endeavor follows the Human Genome Project, an international research project concluded in 2003 that identified the sequence of chemical base pairs making up human DNA and mapped the genes of the human genome.

"The genomic industry has been showing the fastest growth in the 21st century, but the country's investment and technology in that area are far behind those in other industries," the Ministry of Science, ICT and Future Planning said.

"Now is the time when the country urgently needs an active strategy to catch up with the rest of the world."

Genomic technologies have dramatically improved over the past 20 years, partly on the development of information technology, the ministry noted.

In 1990, decoding and analyzing a human genome required up to 15 years and US$3 billion. In 2013, the same task took only one day and $1,000, according to the ministry.

Still, the level of South Korea's genomic technologies remains at 57.7 percent of that of the United States.

The launch of the post-genome project follows a three-year feasibility study from scientific and economic perspectives, the ministry said.

[South Korea may have 57.7 percent of the level of genomic technologies of the United States. However, the USA not only has a "zero dollar post-genome project", but never had (and is unlikely to have anytime soon) a "National Genome Program" comparable, for instance, to NASA for aeronautics and space, or to the National Genome Programs of even rather tiny countries, for instance Estonia. As a Senior National Research Council Associate of the National Academy to NASA (Ames Research Center), at the (first) "launching of the Decade of the Brain" I put together back in 1990 a study program on how the underused and overspending government facilities could have played a lead-role at that time in the Neurosciences, and one could easily outline how, in the present global competition, the US government could measure up, e.g. so as not to slide to second place after China. In 2006, twenty-five of the leading scientists of the International Hologenomics Society worldwide (yes, including one from South Korea...) submitted a manuscript to Science to launch a "Post-Genetics Program" (unpublished). In 2014 the genome industry is globalized, with players like Samsung, Sony/Illumina, BGI/Complete Genomics, Google/Apple (Calico), GE Health, Siemens, etc., with different government subsidies for each global player. Personally, I believe that Intellectual Property will play an increasingly crucial role; perhaps a small part through patents, but an ever-growing part through "know-how" and "trade secrets" - Dr. Pellionisz]

Hacking Your DNA


By David Ewing Duncan / March 12, 2014 12:58 PM EDT

Keeping track of what we reveal about ourselves each day—through email and text messages, Amazon purchases and Facebook "likes"—is hard enough.

Imagine a future when Big Data has access not only to your shopping habits, but also to your DNA and other deeply personal data collected about our bodies and behavior—and about the inner workings of our proteins and cells. What will the government and others do with that data? And will we be unaware of how it's being used—or abused—until a future Edward Snowden emerges to tell us?

Consider this scenario: A few years from now the National Security Agency hires a young analyst trained in cyber-genetics. She is assigned to comb through millions of DNA profiles in search of markers that might identify terrorists and spies and other persons of interest. It's simple enough, since almost every American and billions of other people have deposited their complete genomes—every A, C, T and G in their cells—into one of the huge new digital health networks, the new Googles and Verizons of medical data.

Sequencing a person's entire DNA profile will be as cheap as getting a car wash. High-end automobiles and hotels are likely to have installed photonic (light) sensors—devices that quickly read small segments of DNA in a customer's skin cells to confirm their identity—to unlock doors. Banks may offer DNA-secure accounts that can only be accessed by a person with the correct genetic code.

People in this future world will be accustomed to genetics guiding treatments and saving lives, even as they remain uneasy about who exactly has access—Employers? Insurers? The government? Their spouse or lover?

With her top-secret clearance, the NSA's new analyst discovers that the agency has accessed the genetic records of not only suspected terrorists, but also heads of state and leaders in industry, academia, the arts and the news media. Troubled by what she has learned, the analyst announces that she's taking a vacation, and flies to a neutral country carrying top-secret cyber-genetic documents stored on an encrypted nanochip. Like Edward Snowden, she gives her data to a reporter, with the hope of rectifying the injustices she has witnessed.

For better or worse, we're not there yet. In 2014, neither the government nor the private sector is anywhere near having a World Wide Web for genetic and other personal molecular data, or a global wireless network that can access anyone's genetic data from anywhere. If this were the Internet, the technology would be in about 1985—at the very beginning.

Physicians, however, are already using genomics to predict and diagnose diseases such as breast cancer and macular degeneration. Thousands of parents use prenatal genetic tests to check if their embryo or fetus carries genes for devastating diseases such as Tay-Sachs or Fragile X syndrome. Researchers have discovered genetic markers that can identify mutations in cancerous tumors that allow doctors to target specific chemotherapy drugs to match a patient's mutations in their own DNA—leading, in some cases, to astonishingly high rates of remission.

In the past two decades, the drug industry and government agencies like the National Institutes of Health have plowed hundreds of billions of dollars into turning genetics from a research project into something real. AT&T, Verizon, IBM and other IT giants are developing digital health networks and products, while thousands of start-ups are in a mini-frenzy to create new digital health networks and apps.

Some companies, including Google-backed 23andMe, have begun to provide customers with access to their own genetic data. (23andMe actually stopped providing customers with genetic health data after being warned by the FDA that it needs approval for some of these tests—the company says it is working to fix this.) Labs and companies are also in the very early stages of developing devices that read short DNA sequences using light waves, or a simple pinprick of blood.

In January, San Diego-based Illumina, a gene-sequencing company, announced that it can now sequence an entire genome for only $1,000. This may sound pricey, but just a decade ago a single human genome cost hundreds of millions of dollars to sequence. The price is likely to drop even further in the years ahead.

This year, the number of people having their genomes sequenced could top 50,000, and that number should increase exponentially over the next few years as governments and health-care systems announce projects to sequence hundreds of thousands of people. Last year the U.K. announced plans to sequence 100,000 citizens by 2017. In the U.S., Kaiser Permanente has teamed up with the University of California at San Francisco to sequence 100,000 patients.

Eventually the mountains of data generated by our DNA and digital health records will be linked to Facebook and Twitter pages (or the future equivalent), and to those pink suede shoes you just bought and shared on the latest incarnation of Instagram. We may not like it, but the reality is that we give up this type of information to these companies every day. And if people want to keep getting the services they provide, they're going to keep trading data for it.

The result in a few years will be staggeringly complex statistical models designed to predict your behavior and to identify personality types, including those prone to violence or terrorism. Congress has passed a law barring health insurers and employers from using DNA to discriminate. Beyond this, however, we have few protections.

Genetic predictions will not be perfect or deterministic. It turns out that DNA is only part of the equation that makes you who you are or will be. Using genetic profiling for identifying terrorists or other personality types will also be imprecise and fraught with errors. Yet the more data amassed about individuals over time, the more accurate the modeling that creates the predictions.

For instance, scientists in a 2008 study associated a variant of the MAOA gene—the so-called "warrior gene"—with a predilection for violent behavior in some people. The statistical strength of this correlation is weak, and even if you carry that genetic marker, you may in fact be a full-on pacifist. But let's say that one afternoon you, as a carrier of this gene variant, "liked" an essay by a former Palestinian commando-turned-diplomat. An hour later you got curious about Al-Qaeda and did a quick Google search. What if some search algorithm at the NSA then connected your social media data to your DNA? The next thing you know, the Transportation Security Administration is stopping you from boarding your flight home for the holidays.

This is just one hypothetical example. As we rush into an era of bigger and better data being crunched by legions of government and private-sector employees, we may have to get used to our health information being hacked and interpreted incorrectly, or in ways that might work against us. Of course, it would be better to have an open debate and transparent policies about this type of data now.

Failing that, we may wake up one morning to read that the NSA once again has been spying on us—only this time, it won't be about who we called or texted, but the secrets buried deep inside our cells that tell us a great deal about who we are and who we might be in the future.

COMMENTS: Pellionisz • a minute ago

David is a well-known expert - and his sense of urgency, imho, is correct. I am afraid, however, that his warning will be disregarded by many, just as my 2008 YouTube "Is IT Ready for the Dreaded DNA Data Deluge?" was. Billions of dollars invested into mere sequencing lost tremendous valuation for lack of matching analytics. Now, again, it is very clear what to do. Mere data, pervasive as they will be, will be rather useless without those who know what to do with them. When the emerging nuclear industry yielded immense power, the knee-jerk reaction of the world was a mad rush to buy up the experts, even at hyper-escalating prices. A global competition is already afoot. Those powers that do not wake up to genomics will slide; others will rise high, at the losers' expense. Juan Enriquez said it all in his 2001 best-seller "As the Future Catches You". We have been warned... - comment by Dr. Pellionisz

Really? One does not understand the genome by having sequenced it all? - Now we learn from JAMA

["War and Peace" in Russian has a thousand times fewer letters than the human genome - try to understand either by "reading it" from cover-to-cover!]

Clinical Application of Whole-Genome Sequencing - Proceed With Care

William Gregory Feero, M.D., Ph.D. [JAMA]

Clinical Interpretation and Implications of Whole-Genome Sequencing

Frederick E. Dewey, MD1,2,3,4; Megan E. Grove, MS1,2,3,4; Cuiping Pan, PhD4,5; Benjamin A. Goldstein, PhD6; Jonathan A. Bernstein, MD, PhD7; Hassan Chaib, PhD4,5; Jason D. Merker, MD, PhD8; Rachel L. Goldfeder, BS9; Gregory M. Enns, MB, ChB7; Sean P. David, MD, DPhil6; Neda Pakdaman, MD6; Kelly E. Ormond, MS5,10; Colleen Caleshu, MS1,2,3,7; Kerry Kingham, MS11; Teri E. Klein, PhD5; Michelle Whirl-Carrillo, PhD5; Kenneth Sakamoto, MD3,6; Matthew T. Wheeler, MD, PhD1,2,3,4; Atul J. Butte, MD, PhD7,12; James M. Ford, MD, PhD11; Linda Boxer, MD6; John P. A. Ioannidis, MD, PhD6,12,14,15; Alan C. Yeung, MD2,3; Russ B. Altman, MD, PhD5,6,16; Themistocles L. Assimes, MD, PhD2,3; Michael Snyder, PhD2,4,5; Euan A. Ashley, MRCP, DPhil1,2,3,4,5; Thomas Quertermous, MD1,2,3,4

[-] Author Affiliations

1Stanford Center for Inherited Cardiovascular Disease, Stanford, California

2Stanford Cardiovascular Institute, Stanford, California

3Division of Cardiovascular Medicine, Stanford University, Stanford, California

4Stanford Center for Genomics and Personalized Medicine, Stanford, California

5Department of Genetics, Stanford University, Stanford, California

6Department of Medicine, Stanford University, Stanford, California

7Department of Pediatrics, Stanford University, Stanford, California

8Department of Pathology, Stanford University, Stanford, California

9Biomedical Informatics Training Program, Stanford University, Stanford, California

10Stanford Center for Biomedical Ethics, Stanford, California

11Division of Medical Oncology, Stanford University, Stanford, California

12Division of Systems Medicine, Stanford University, Stanford, California

14Stanford Prevention Research Center, Stanford, California

15Department of Health Research and Policy, Stanford University, Stanford, California

16Department of Bioengineering, Stanford University, Stanford, California

JAMA. 2014;311(10):1035-1045. doi:10.1001/jama.2014.1717.

Importance Whole-genome sequencing (WGS) is increasingly applied in clinical medicine and is expected to uncover clinically significant findings regardless of sequencing indication.

Objectives To examine coverage and concordance of clinically relevant genetic variation provided by WGS technologies; to quantitate inherited disease risk and pharmacogenomic findings in WGS data and resources required for their discovery and interpretation; and to evaluate clinical action prompted by WGS findings.

Design, Setting, and Participants An exploratory study of 12 adult participants recruited at Stanford University Medical Center who underwent WGS between November 2011 and March 2012. A multidisciplinary team reviewed all potentially reportable genetic findings. Five physicians proposed initial clinical follow-up based on the genetic findings.

Main Outcomes and Measures Genome coverage and sequencing platform concordance in different categories of genetic disease risk, person-hours spent curating candidate disease-risk variants, interpretation agreement between trained curators and disease genetics databases, burden of inherited disease risk and pharmacogenomic findings, and burden and interrater agreement of proposed clinical follow-up.

Results Depending on sequencing platform, 10% to 19% of inherited disease genes were not covered to accepted standards for single nucleotide variant discovery. Genotype concordance was high for previously described single nucleotide genetic variants (99%-100%) but low for small insertion/deletion variants (53%-59%). Curation of 90 to 127 genetic variants in each participant required a median of 54 minutes (range, 5-223 minutes) per genetic variant, resulted in moderate classification agreement between professionals (Gross κ, 0.52; 95% CI, 0.40-0.64), and reclassified 69% of genetic variants cataloged as disease causing in mutation databases to variants of uncertain or lesser significance. Two to 6 personal disease-risk findings were discovered in each participant, including 1 frameshift deletion in the BRCA1 gene implicated in hereditary breast and ovarian cancer. Physician review of sequencing findings prompted consideration of a median of 1 to 3 initial diagnostic tests and referrals per participant, with fair interrater agreement about the suitability of WGS findings for clinical follow-up (Fleiss κ, 0.24; P < .001).

Conclusions and Relevance In this exploratory study of 12 volunteer adults, the use of WGS was associated with incomplete coverage of inherited disease genes, low reproducibility of detection of genetic variation with the highest potential clinical effects, and uncertainty about clinically reportable findings. In certain cases, WGS will identify clinically actionable genetic variants warranting early medical intervention. These issues should be considered when determining the role of WGS in clinical medicine.
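The abstract's agreement figures (e.g., Fleiss κ, 0.24) are chance-corrected interrater statistics: observed agreement among raters, discounted by the agreement expected by chance alone. As a rough illustration — a minimal sketch of the standard Fleiss formula, not code from the study — κ can be computed from a table of per-subject category counts:

```python
# Illustrative sketch only: Fleiss' kappa for agreement among a fixed
# number of raters. `ratings` has one row per subject; each row holds
# the count of raters who assigned the subject to each category.
def fleiss_kappa(ratings):
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])            # assumed constant per subject
    n_categories = len(ratings[0])
    total = n_subjects * n_raters
    # Overall proportion of assignments falling in each category
    p = [sum(row[j] for row in ratings) / total for j in range(n_categories)]
    # Per-subject observed agreement among pairs of raters
    P = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
         for row in ratings]
    P_bar = sum(P) / n_subjects           # mean observed agreement
    P_e = sum(pj * pj for pj in p)        # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)
```

Three curators in perfect agreement on every variant give κ = 1, while κ near 0 means agreement no better than chance — which is why the study's 0.24 counts as only "fair".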

[Predictably, the JAMA study and editorial created a "media-frenzy" all over the net. The one below is just one of those too many to list - presently the number of related popular articles is 145. One wonders just how many it takes, since in my Google Tech Talk YouTube six years ago I warned about the "Dreaded DNA Data Deluge". In the talk, I also disclosed for the general public my peer-reviewed science paper, The Principle of Recursive Genome Function, offering an alternative to junk/dogma. The paper has been downloaded thousands of times, with zero objection. Still, only the bravest dared to cite it ... Pellionisz]


By Jason Koebler


Despite the promise of the $1,000 genome sequence, personalized medicine is likely to remain a luxury product—and not a very accurate one at that. For the foreseeable future, researchers will continue to struggle to overcome significant hurdles that make predicting someone’s propensity for disease expensive, time-consuming, and potentially unreliable.

A new study on the present-day feasibility of whole-genome sequencing for clinical use by researchers at Stanford University found that it will cost at least $17,000 per person to sequence a genome and interpret the results, and it’ll take roughly 100 man-hours to perform any sort of meaningful analysis.

“The gist of it is, we found that the results are generally not clinically acceptable,” said Frederick Dewey, lead author of the analysis, published in the Journal of the American Medical Association. “It’s a relatively sobering thought, and there are tough hurdles to get over before this is common.”

Dewey and his team completely sequenced the genomes of 12 people and analyzed them to predict their propensity for genetic diseases and other health concerns. The study was designed to test out some of the leading genome sequencing techniques and analysis methods. The expensive part, Dewey said, isn’t necessarily the sequencing of the genome (a cost that is constantly coming down) but the analysis after the fact. 

“The main challenge at this point, assuming you have technically valid data, is from a manpower perspective,” he said. “What we have learned is that a sequence doesn’t equal interpretation—there’s a significant manual interpretation job afterward.” 

But that’s not all. The reason the results were not deemed clinically acceptable in most cases is because, even with the most advanced sequencing techniques, the researchers found that, on average, between 10 and 19 percent of inherited disease genes “were not consistently covered at a read depth that was sufficient for a comprehensive survey of genetic variants.” In general, whole genome sequencing wasn’t great at picking up nucleotide insertion and deletion mutations—with one commonly used sequencing method, just one third were picked up. That’s important because those mutations are commonly medically relevant.
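The coverage problem described here — disease genes "not consistently covered at a read depth sufficient" for variant calling — is conceptually simple to check once per-base depths are in hand. A hypothetical sketch (the helper name and thresholds are my own illustration, not the study's pipeline):

```python
# Hypothetical sketch (not the study's pipeline): flag genes whose
# fraction of bases reaching a minimum read depth falls below a
# completeness cutoff.
def poorly_covered(gene_depths, min_depth=20, completeness=0.95):
    flagged = []
    for gene, depths in gene_depths.items():   # depths: per-base read depth
        frac = sum(1 for d in depths if d >= min_depth) / len(depths)
        if frac < completeness:
            flagged.append(gene)
    return flagged
```

For example, a gene with 18 of 20 bases at depth ≥ 20 is only 90% covered and would be flagged under a 95% completeness cutoff — the per-gene analogue of the study's 10-19% finding.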

The study wasn’t all bad news, however. Of the dozen subjects whose genomes were sequenced and analyzed, the team found a woman with no family history of breast cancer who had a mutation in the BRCA1 gene that makes her much more likely to develop the disease. 

Dewey said that, although the results of this initial study suggest that personalized genomic medicine isn’t quite ready for primetime,  all of the problems the team found are ones that researchers are actively working on. 

“It’s remarkable how far we’ve come, and I would expect the pace of discovery and innovation will continue,” he said. “There’s a long way to go, but what we’ve done so far is far beyond anything we would have anticipated years ago.”


Andras Pellionisz · Member of Board of Advisers to DRCcomputer, Sunnyvale, California

Dr. Karp of the Simons Institute for the Theory of Computing at Berkeley uses an interesting example of how inexpensive it has become to obtain a full human DNA sequence. Though the first cost $3 Bn, the price plummeted to a couple of hundred dollars; "if the price of cars had dropped similarly, the price of a Ferrari would be 40 cents". We all understand cars – they get us somewhere. Where do 6.2 Bn A,C,T,G letters get us if we only have letter sequences without a mathematical understanding of their meaning? Try understanding "War and Peace" in Russian without knowing the language. The human DNA is about a thousand times longer, and scientists are not sure if it has to be read linearly, in parallel or in a "chaotic" manner. An increasing number of mathematicians and institutes recognize that genome function is essential in medicine. The naïve era when "genome theory" consisted only of "junk and dogma" is dead. With sufficient resources, mathematical genome theory will yield sophisticated recursive algorithms that will take us on a new path in medicine.

No longer junk: Role of long noncoding RNAs in autism risk

SFARI Simons Foundation Autism Research Initiative

Nikolaos Mellios, Mriganka Sur

4 March 2014

RNA acts as the intermediary between genes and proteins, but the function of pieces of RNA that do not code for protein has, historically, been less clear. Until recently, researchers ignored these noncoding RNAs for not complying with the central dogma of biology — that a straight line runs from gene to RNA (transcription) to protein (translation). However, noncoding RNAs are emerging as important regulators of diverse cellular processes, with implications for numerous human disorders.

Extensive research has already examined the function of microRNAs, a category of small evolutionarily conserved noncoding RNAs about 22 to 24 nucleotides in length that target protein-coding genes in a sequence-specific manner. A plethora of microRNAs are important for brain function and neuropsychiatric diseases, including autism1.

In the past decade, long noncoding RNAs (lncRNAs), which extend longer than 200 nucleotides, have emerged as additional important players in the control of gene expression. They fine-tune the expression of numerous genes and direct the activity of complex regulatory pathways, often in a cell- and developmental-stage-specific manner.

They are found in many places in the genome: within genes, near gene regulatory regions or by themselves (intergenic noncoding RNAs). lncRNAs may overlap with the genetic code for a protein or be expressed in the opposite, or antisense, direction.

In addition to the diversity in their biogenesis, lncRNAs exhibit an impressive versatility of molecular functions. These range from passive influence on the transcription of nearby genes to limiting expression to a paternal or maternal chromosome, a process called imprinting, and inactivating one copy of the X chromosome.

They also interact with chromatin-modifying complexes, which regulate gene expression by changing the packaging of DNA, and with transcription factors that directly regulate gene expression. They may influence RNA splicing, stability and localization and play a role in the translation of RNA to protein and in protein activation. Finally, they may ‘sponge’ up certain microRNAs, thus blocking their function2, 3, 4, 5.

Molecular multitaskers:

The ability of lncRNAs to engage in such molecular multitasking may allow them to link multiple risk factors for genetic disorders into functional networks. This makes them attractive candidates for autism spectrum disorders, which are characterized either by interactions of multiple genes or by disruptions in a single gene that influences numerous molecular pathways.

Whether whole-genome DNA sequencing data will reveal strong genetic links with lncRNAs, as it has for microRNAs, is not yet clear. One thing, though, remains certain: We can no longer overlook such a substantial and active chunk of the transcriptome and characterize it as ‘junk’ or ‘transcriptional noise’ if we hope to fully understand complex disorders such as autism.

In the past few years, studies have found alterations in lncRNAs in brains from people with autism, suggesting that they contribute to autism risk. For example, MSNP1AS, a lncRNA transcribed from a region of chromosome 5 that carries an autism-associated variant, is elevated in the cortex of people with autism who also carry the disease-related variant6. MSNP1AS may regulate moesin, a gene important for the structure of neurons’ signal-receiving branches, or dendrites, and immune system activation.

Last year, a carefully conducted study identified numerous lncRNAs that are robustly dysregulated in autism postmortem brain samples7. Impressively, some disease-altered lncRNAs are found near important autism-linked genes such as BDNF and SHANK2.

Another lncRNA with potential implications for autism is LOC389023, which regulates DPP10, a gene linked to autism and other neurodevelopmental disorders. DPP10 controls the structure and function of neuronal junctions, or synapses, via its effects on potassium ion channels3.

Last year, researchers used a similar approach to study the expression of lncRNAs in a mouse model of Rett syndrome8. One lncRNA (AK081227) that is expressed at abnormal levels in these mice controls the expression of its host protein-coding gene, the gamma-aminobutyric acid receptor subunit Rho 2 (GABRR2), which has also been linked to autism.

Additional reports have linked other lncRNAs to autism, such as those that travel antisense to the FMR1 gene9, 10 and the UBE3A gene11, 12. Mutations in these genes underlie fragile X syndrome and Angelman syndrome, respectively. Other studies have also uncovered a subset of lncRNAs expressed from the autism-linked PTCHD1 gene13 and the 7q31 chromosomal region14.

In addition, the lncRNA ZNF127AS has altered expression in the brains of people with Prader-Willi syndrome15. On a similar note, a cluster of small nucleolar RNAs — which despite their name are a category of lncRNAs — are encoded by the paternally inherited microdeletion at 15q11.2 that is also linked to Prader-Willi syndrome16.

Brain builders:

Previous work has identified a subset of lncRNAs that are important for regulating the birth of new neurons, or neurogenesis, and the process by which synapses adapt to experience, called synaptic plasticity.

Of particular importance is the finding that the intergenic noncoding RNA MALAT1, one of the most highly expressed lncRNAs in the brain, can regulate the formation of new synapses, or synaptogenesis. It does this by associating inside the nucleus with multiple RNA splicing factors and influencing the expression of autism-linked genes, such as NLGN117.

Intriguingly, there are several other links between MALAT1 and autism-associated factors. For example, beta-catenin — an important component of the WNT signaling pathway that has been linked to multiple neuropsychiatric disorders — activates MALAT1 transcription18. CREB, another transcription factor known for its role in activity-dependent gene expression, also binds to MALAT1. Notably, CREB may control MALAT1 transcription following exposure to the peptide hormone oxytocin, which has also been linked to autism19.

MALAT1 and another lncRNA, BDNFOS, which has the antisense, or opposite, code to that of the autism-linked BDNF gene, are expressed in conjunction with neuronal activity20. On the other hand, GOMAFU, a lncRNA whose levels are dampened in postmortem brains from people with schizophrenia, is significantly suppressed following the activation of mouse cortical neurons21.

Other lncRNAs run antisense to important synaptic plasticity-related genes, such as NRGN, CAMK2N1 and CAMKK122, 23. lncRNAs are also associated with genes linked to changes in the synapse that occur after exposure to cocaine24. Interestingly, a novel subset of lncRNAs are expressed from the regulatory elements of genes, such as c-FOS and ARC, that regulate gene transcription in response to neuronal activity25.

Adding to their important role in brain plasticity, lncRNAs are highly expressed during prenatal neurogenesis and are important for maintaining and differentiating the precursors to neurons: neural stem cells and neuronal progenitors26, 27. Of particular interest is the lncRNA EVF2, which runs antisense to the regulator gene DLX5,6 and plays a crucial role in the birth of neurons that dampen brain activity28. This adds another layer to the role of lncRNAs in cell-type-specific neuronal functions.

Despite these many threads, much more work is needed to determine the exact mechanisms of action and the physiological significance of lncRNAs for autism and other neurodevelopmental disorders.

Mriganka Sur is professor of neuroscience at the Massachusetts Institute of Technology, in Cambridge. Nikolaos Mellios is a postdoctoral fellow in his laboratory.

News and Opinion articles are editorially independent of the Simons Foundation.


1. Mellios N. and M. Sur Front. Psychiatry 3, 39 (2012) PubMed
2. Qureshi I.A. and M.F. Mehler Nat. Rev. Neurosci. 13, 528-541 (2012) PubMed
3. Tushir J.S. and S. Akbarian Neuroscience Epub ahead of print (2013) PubMed
4. Mercer T.R. and J.S. Mattick Nat. Struct. Mol. Biol. 20, 300-307 (2013) PubMed
5. Bak R.O. and J.G. Mikkelsen Wiley Interdiscip. Rev. RNA Epub ahead of print (2013) PubMed
6. Kerin T. et al. Sci. Transl. Med. 4, 128ra40 (2012) PubMed
7. Ziats M.N. and O.M. Rennert J. Mol. Neurosci. 49, 589-593 (2013) PubMed
8. Petazzi P. et al. RNA Biol. 10, 1197-1203 (2013) PubMed
9. Ladd P.D. et al. Hum. Mol. Genet. 16, 3174-3187 (2007) PubMed
10. Pastori C. et al. Hum. Genet. 133, 59-67 (2014) PubMed
11. Chamberlain S.J. and C.I. Brannan Genomics 73, 316-322 (2001) PubMed
12. Le Meur E. et al. Dev. Biol. 286, 587-600 (2005) PubMed
13. Noor A. et al. Sci. Transl. Med. 2, 49ra68 (2010) PubMed
14. Vincent J.B. et al. Genomics 80, 283-294 (2002) PubMed
15. Jong M.T. et al. Hum. Mol. Genet. 8, 783-793 (1999) PubMed
16. Sahoo T. et al. Nat. Genet. 40, 719-721 (2008) PubMed
17. Bernard D. et al. EMBO J. 29, 3082–3093 (2010) PubMed
18. Wang J. et al. Cell Signal. 26, 1048-1059 (2014) PubMed
19. Koshimizu T.A. et al. Life Sci. 86, 455-460 (2010) PubMed
20. Lipovich L. et al. Genetics 192, 1133-1148 (2012) PubMed
21. Barry G. et al. Mol. Psychiatry Epub ahead of print (2013) PubMed
22. Ling K.H. et al. Cereb. Cortex 21, 683-697 (2011) PubMed
23. Mercer T.R. et al. Neuroscientist 14, 434-445 (2008) PubMed
24. Bu Q. et al. J. Neurochem. 123, 790-799 (2012) PubMed
25. Kim T.K. et al. Nature 465, 182-187 (2010) PubMed
26. Ng S.Y. et al. EMBO J. 31, 522-533 (2012) PubMed
27. Sauvageau M. et al. Elife 2, e01749 (2013) PubMed
28. Bond A.M. et al. Nat. Neurosci. 12, 1020-1027 (2009) PubMed

[A perhaps naive footnote: the barely nascent mathematical (software-enabling) foundation of genomics may become a show-stopper for the zillion-dollar present investments into globalized industrial genomics, maybe sooner than we think. (Can you picture the nuclear industry without quantum mechanics? It could have been orders of magnitude more expensive, and very dangerous.) In this spirit, see some further studies here, especially the mathematical interpretation of the RNA system and the already demonstrated implications of fractal defects in cancers, autism, schizophrenia and auto-immune disorders. A regular collective PubMed publication printed in 2014 is here - Dr. Pellionisz]

Craig Venter's Latest Startup Gets $70 Million to Sequence Loads of Genomes

by Bruce V. Bigelow, The Motley Fool Mar 4th 2014

J. Craig Venter, the human genome pioneer, today unveiled a new San Diego-based venture with an ambitious goal of providing whole genome sequencing and cell-therapy-based diagnostic services for patients.

Venter said he co-founded the company, Human Longevity, or HLI, with Robert Hariri, who oversaw Celgene Cellular Therapeutics, and Peter Diamandis of the X Prize Foundation. The company has already raised $70 million in Series A venture financing from investors including a Malaysian investment fund that is also the lead investor in another Venter venture, Synthetic Genomics; San Diego-based Illumina; and other individual investors.

Venter plans to serve as the chairman and CEO of both HLI and San Diego-based Synthetic Genomics, which he co-founded in 2005 to engineer genes within organisms so they can produce fuel, chemicals, medicines, and nutritional products.

The HLI effort, which Venter and others repeatedly described as "unprecedented" in a conference call this morning, will initially focus on basic research and development, sequencing every cancer patient who comes into the UC San Diego Moores Cancer Center, as well as other patients with diabetes, obesity, heart and liver diseases, and dementia.

In a statement this morning, HLI says it has established collaborative research and development partnerships with UC San Diego, Metabolon, and the J. Craig Venter Institute. North Carolina-based Metabolon is expected to provide information about patients' metabolites, key chemicals in their blood.

HLI also has formed a Biome Healthcare division, led by Karen Nelson, to generate gene sequencing data of the microbiome for many patients. The microbiome consists of all the microbes that live in the human gut (and elsewhere in and on the human body). New research by Nelson and others suggests that such bacteria play a key role in human health and disease.

The early goal at HLI is to sequence 40,000 human genomes a year, and amass the world's largest database of human genome sequences. HLI plans to combine its human genome data with microbiome data and phenotype databases to develop new treatments, including stem cell therapies, for aging-related diseases. Eventually, HLI intends to sequence 100,000 human genomes a year.

"We will be using that [data] to make numerous new discoveries in preventative medicine," Venter said. "We think this will have a huge impact on changing the cost of medicine and broad human health."

HLI plans to make money by providing pharmaceutical, biotechnology and academic organizations with access to its database, by offering gene sequencing, and by developing new medical diagnostics and therapeutics. As HLI's genome-sequencing capacity increases, the company plans to sequence people of all ages -- from infants to supercentenarians -- including those who are healthy as well as those with disease.

Genomic data would be shared among researchers participating in the effort, although Venter said details about protecting patient privacy are still being worked out.

DFJ partner Steve Jurvetson, who said he's an investor in HLI, asked Venter during the call to compare the effort to BGI China, one of the world's premier gene sequencing centers. BGI says on its U.S. website that it has relationships with 17 out of the top 20 global pharmaceutical companies as part of its suite of commercial science, health, agricultural, and informatics services.

Venter answered that HLI will be working with more advanced gene sequencing equipment (Illumina's HiSeq X Ten Sequencing Systems) and will have higher throughput capabilities. But he added that there is plenty of room for gene sequencing services. In fact, Venter envisions a future where genome sequencing is done routinely for every patient admitted to a hospital.

"We cannot have enough players in the human genome sequencing field," Venter said. "We're just trying to elevate it to a new level."

With $70 million in initial funding, "We think that will carry us through the first 18 months as we build these operations," Venter said. HLI already is using laboratories in San Diego for high-throughput microbiome and human genome sequencing, and has plans to build a new facility near the UC San Diego campus. HLI plans to hire about 100 scientists and others over the next year, Venter said.

"This is a revolution in biomedical research," said David A. Brenner, vice chancellor for health sciences and dean of the UC San Diego School of Medicine. "This is the first time ever as physician-scientists that we have had the opportunity to handle very large datasets and to be able to compare the genetics of a patient with the microbiome, as Dr. Nelson said, and with the metabolites in the blood to try to gain new insights into our understanding of disease pathogenesis, into diagnoses, and hopefully, into new therapies and cures."

[Knowing the players over a decade, Steve Jurvetson's question may have implied that "sequencing is a necessary but not sufficient condition for software analysis". BGI has had close to 10,000 software developers (average age 27, monthly salary $411). The brightest investors of Silicon Valley may ask how such competition can be outsmarted by hiring 100 - Dr. Pellionisz]

GE Ventures supports RainDance’s vision to make liquid biopsy a commercial reality

RainDance Technologies Closes Series E Financing with GE Ventures

March 2, 2014 2:02 AM

Business Wire

BILLERICA, Mass.–(BUSINESS WIRE)–March 3, 2014–

RainDance Technologies, Inc., an innovative genomics tools company making molecular testing of complex diseases more standardized and readily available, today announced that it has closed a $16.5 million Series E financing round extension. New investors include GE Ventures and Northgate Capital; all existing financial investors also participated in this round of financing.

The additional investment accelerates RainDance’s commercial expansion and assay development initiatives for the company’s products:

RainDrop™ digital PCR system, which provides unparalleled sensitivity for research into early detection and monitoring of cancers, viruses, pathogens and immune markers;

ThunderStorm™ targeted sequencing system, which advances research into and understanding of hereditary risk, inherited diseases and hematologic malignancies; and,

ThunderBolts™ Cancer Panel, which delivers an actionable, fast, simple and low cost sequencing profile of somatic tumor mutations in molecular biopsies or biobanked samples.

GE Ventures now becomes the second strategic investor in RainDance, following an investment from Myriad Genetics in 2013, and shares RainDance's mission of addressing healthcare issues by accelerating the development and commercialization of products that improve patient care and save lives. The liquid biopsy method currently in development by RainDance, which uses readily accessible fluids, is minimally invasive and offers a low-cost healthcare solution for patients.

“We believe technology will play a major role in transforming patient care and by partnering with leading innovators we can help scale the best new ideas in major industries like healthcare,” said Sue Siegel, CEO of GE Ventures and healthymagination. “We have followed RainDance’s progress for many years and are impressed with how the company’s cutting edge technologies are advancing the market and helping to bring in a new era of more accurate, non-invasive and cost-efficient testing for complex genetic disease research.”

Roopom Banerjee, RainDance President and CEO, noted, “We welcome GE Ventures and Northgate as new investors. GE brings an unparalleled global network of businesses and partners, and is a logical outgrowth of the work we have under way with industry and research leaders committed to making the world healthier. Our success bringing in strategic investors such as GE Ventures and Myriad Genetics demonstrates we are a trusted platform partner to large customers in advancing the understanding, detection and monitoring of cancer and other important diseases.”

RainDance has raised more than $100 million from investors including Mohr Davidow Ventures, Quaker BioVentures, Alloy Ventures, Acadia Woods Partners, Sectoral Asset Management, Northgate Capital and Capital Royalty Partners, to support the development and commercial expansion of its leading technologies for Digital PCR and Next-Generation Targeted Sequencing (NGS). The RainDrop Digital PCR System, which has 50x to 1000x the sensitivity of traditional quantitative PCR, enables researchers to analyze low-frequency alleles with true single molecule detection capability. Separately, RainDance’s ThunderStorm System enables commercial laboratory customers to develop their own proprietary next-generation sequencing panels for disease research. RainDance products generally feature an ‘open system’ that leverages decades and billions of dollars already invested in NGS and PCR chemistries and innovations.
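The single-molecule sensitivity claimed for digital PCR rests on partitioning statistics: the sample is split into many droplets, and the fraction of droplets that amplify is converted into an absolute target concentration through a Poisson correction. A minimal sketch of that standard calculation (this is not RainDance software; the droplet counts and droplet volume below are hypothetical, chosen only for illustration):

```python
import math

def dpcr_copies_per_droplet(positive: int, total: int) -> float:
    """Estimate mean target copies per droplet from the fraction of
    positive droplets, assuming targets distribute across droplets by
    Poisson statistics:
        P(droplet negative) = exp(-lambda)  =>  lambda = -ln(1 - p_pos)
    """
    p_pos = positive / total
    return -math.log(1.0 - p_pos)

def dpcr_concentration(positive: int, total: int,
                       droplet_volume_ul: float) -> float:
    """Absolute target concentration in copies per microliter."""
    lam = dpcr_copies_per_droplet(positive, total)
    return lam / droplet_volume_ul

# Hypothetical run: 1,500 positive droplets out of 1,000,000,
# each droplet 5 picoliters (5e-6 microliters).
lam = dpcr_copies_per_droplet(1500, 1_000_000)
conc = dpcr_concentration(1500, 1_000_000, 5e-6)
```

Because the Poisson correction accounts for droplets that happen to contain more than one target molecule, the estimate stays accurate even at the low allele fractions the article describes.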

Most recently RainDance announced that it has developed the ThunderBolts™ Cancer Panel, a comprehensive NGS panel for profiling actionable cancer mutations. ThunderBolts enables researchers to rapidly and cost-effectively analyze precious biopsy specimens such as those contained in bio-banks worldwide. The ThunderBolts Cancer Panel is currently being made available to select First Access Program (FAP) participants. Researchers from FAP sites will present data at a workshop at the American Association for Cancer Research (AACR) Annual Meeting in April 2014 in San Diego, California. For more information on the ThunderBolts Cancer Panel, please visit

Cowen Group 34th Annual Health Care Conference

Roopom Banerjee will be presenting more information on RainDance during the Cowen Group’s 34th Annual Health Care Conference on March 3, 2014 at 10:00am at the Boston Marriott Copley Place in Boston.

About GE Ventures

GE Ventures is committed to identifying, scaling and accelerating ideas that will make the world work better. Focused on the areas of software, advanced manufacturing, energy and healthcare, GE Ventures helps entrepreneurs and start-ups succeed by providing access to GE’s technical expertise, capital and opportunities for commercialization through GE’s global network of business, customers and partners. GE Ventures offers an unparalleled level of resources through its Global Research Center, including: 35,000 engineers; 5,000 research scientists; 8,000 software professionals; as well as 40,000 sales, marketing and development resources in over 100 countries. For more information, please visit

About RainDance Technologies

RainDance Technologies is making complex genetics simple. The company’s ultra-sensitive genomic tools are leading to new non-invasive liquid biopsy applications for more accurate, reliable, cost-effective and early detection of cancer, inherited and infectious diseases. Major research institutions and more than 70 leading translational, genetic and commercial laboratories around the world rely on RainDance systems’ superior performance. Based in Billerica, Massachusetts, the company supports customers using RainDrop Digital PCR and ThunderStorm Targeted DNA Sequencing Systems through its international sales and service operations as well as a global network of distributors and commercial service providers.

Google Launches Genomics Effort, Joins Global Alliance

GEN News Highlights

Mar 3, 2014

Alex Philippidis

[Looks familiar? Progress is often characterized by a "7-year cycle". Close to this timeframe (with 7 years ahead, perhaps reserving some bragging rights), the Google Tech Talk YouTube predicted in 2008 at time 11:19 that the "parallel computing architecture" is not much of a challenge to Google - essentially a matter of time and/or money (buy/build). However, I also pointed out that while "Information Technology" is pretty much a given (with ample leveraging of Google/Amazon/Oracle/Sony/Samsung/Siemens/GE Health and zillions to join the fray, even Cisco not excluded - the global competition no longer gracious enough to permit further wasted decades) - "IT2", what I called the "Information THEORY" of hologenomics, had to be built anew (The Principle of Recursive Genome Function peer-reviewed science paper was announced at the talk, cited thus far only by the top dozens who would dare to adopt it). Even two months ago (Dec. 30, 2013) there was still a retired professor closer to the North Pole who publicly claimed that "90 percent of the human genome was (still) JUNK"! As we see in the footer, having ventured into several aspects of genomics, Larry Page apparently decided what the "must" in genomics is: "focus is on the science". Is Google going to do "in house" breakthrough hologenome science? Since it takes less than 5 minutes to realize the (now) obvious, they might. If others innocently bring it "in house", even better... Would Google wait till NIH "cooperates with industry" at $150k levels? This definitely looks like an antiquated and now expired option! - Will dominance be established by truly major player(s), on an exclusive or non-exclusive basis, by securing IP? Dr. Pellionisz, Holgentech, Inc.]

Google has unveiled Google Genomics, a proposal for a web-based application programming interface (API) designed to import, process, store, and search genomic data at scale, while being simple to use.

At the same time, Google said it joined the Global Alliance for Genomics and Health, an international effort aimed at developing common approaches for responsible, secure, and effective sharing of genomic and clinical information in the cloud with the research and healthcare communities.

The internet search and services giant is one of 146 top technology, healthcare, research, and disease advocacy organizations worldwide in the global alliance, which vowed that its harmonized standards will meet the highest standards of ethics and privacy.

Established last year, the alliance has drawn numerous research institutions, five of which were named interim host institutions: the Broad Institute and Brigham and Women's Hospital, the Ontario Institute for Cancer Research, and the Wellcome Trust Sanger Institute / European Bioinformatics Institute.

The alliance has also drawn members that include drug developers (Amgen, Biogen Idec, and Merck & Co.), sequencing giants (BGI-Shenzhen and Illumina), and cloud-based genomic analysis firm DNAnexus – which in January said it closed on a $15 million Series C financing round co-led by Google Ventures.

Google Ventures has also invested in direct-to-consumer (DTC) genetic testing company 23andMe. And in September, Google extended its healthcare presence by launching Calico, a new company focused on developing technologies to fight aging and associated diseases. Calico is led by Arthur Levinson, chairman of Roche’s Genentech subsidiary and its CEO before the acquisition, as well as chairman of Apple and a former director of Google. Not all Google initiatives in healthcare have been successful. In 2011, the company shut down Google Health, an electronic health records effort that failed to catch on.

To launch Google Genomics, Google has previewed its implementation of the API built on its cloud infrastructure, including sample data from public datasets like the 1000 Genomes Project, and made public a collection of in-progress open-source sample projects built around the common API.

In unveiling the API as a limited release for discussion by the research community, Google cautioned that not all of the interface’s functionality had been implemented yet, and that the company’s focus to date in launching Google Genomics had been the shape of the API more than its performance.

But Google also laid out what it sees as the benefits of its API to prospective users: It allows users to focus on science rather than tech details such as servers and file formats; store genomic data securely so that private data remains private, while public data is available to the community anywhere; and process as much data as they need, all at once. For example, the API would let users import data for entire cohorts in parallel, as well as search and slice data from many samples in a single query.
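The article does not give the schema of the 2014 limited-release API, but the "search and slice data from many samples in a single query" idea can be illustrated with a hypothetical variants-search query evaluated against a tiny in-memory variant store. All field names, coordinates, and sample IDs below are illustrative, not the actual Google Genomics API:

```python
# Hypothetical request body for a variants-search call: a genomic
# region plus a list of call sets (samples) to slice across.
query = {
    "referenceName": "chr17",
    "start": 41_196_312,   # illustrative region coordinates
    "end": 41_277_500,
    "callSetIds": ["sample-001", "sample-002", "sample-003"],
}

def slice_variants(variants, query):
    """Return variants that overlap the queried region and carry a
    call in at least one of the requested samples (call sets)."""
    wanted = set(query["callSetIds"])
    out = []
    for v in variants:
        overlaps = v["start"] < query["end"] and v["end"] > query["start"]
        in_samples = any(c["callSetId"] in wanted for c in v["calls"])
        if v["referenceName"] == query["referenceName"] and overlaps and in_samples:
            out.append(v)
    return out

# Stand-in for a cohort's variant store.
variants = [
    {"referenceName": "chr17", "start": 41_200_000, "end": 41_200_001,
     "calls": [{"callSetId": "sample-001"}]},                 # matches
    {"referenceName": "chr17", "start": 50_000_000, "end": 50_000_001,
     "calls": [{"callSetId": "sample-002"}]},                 # outside region
    {"referenceName": "chr1", "start": 41_200_000, "end": 41_200_001,
     "calls": [{"callSetId": "sample-003"}]},                 # wrong chromosome
]
hits = slice_variants(variants, query)
```

The point of the cloud API is that exactly this kind of region-and-sample slice would run server-side over entire cohorts at once, instead of forcing each researcher to download and parse per-sample files.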

“With these first steps, it is our goal to support the global research community in bringing the vision of the Global Alliance for Genomics and Health to fruition,” Jonathan Bingham, product manager with Google, said in a post on the company’s research blog. “Imagine the impact if researchers everywhere had larger sample sizes to distinguish between people who become sick and those who remain healthy, between patients who respond to treatment and those whose condition worsens, between pathogens that cause outbreaks and those that are harmless. Imagine if they could test biological hypotheses in seconds instead of days, without owning a supercomputer.”

The blog post included a link allowing would-be users to request access to the API for their research, asking them to describe themselves and their research interests: “We will let you know when we’re ready to work with more partners.”

“Together with the members of the Global Alliance for Genomics and Health, we believe we are at the beginning of a transformation in medicine and basic research, driven by advances in genome sequencing and huge-scale computing,” Bingham added.

[What to make of Google Life, 23andMe, Calico, Google Genome? Consult Dr. Pellionisz - Holgentech, Inc.]

Google Could Disrupt These 3 Medical Industries Within 10 Years

By Leo Sun
Motley Fool

February 1, 2014

[When will Larry Page "go fractal" (composed of self-similar repetitions)? - We all know that the Internet itself is fractal, see at 42:40 of Google Tech Talk YouTube, and that "fractal genome grows fractal organisms"; cancerous if the fractal has defects. Thus, the question is not if, but when. Larry Page need not be told that powerful algorithmic approaches yield the utmost utility - AJP]

Search giant Google (NASDAQ: GOOG) is a master market disruptor. Mobile devices, location-based services, online advertising, email, and news have all been disrupted by Google to some degree over the past 16 years.

One industry that hasn't been associated with Google, however, is the rapidly growing health care market.

Yet in my opinion, Google could take advantage of the convergence of technology and health care and disrupt three specific fields over the next 10 years, threatening the traditional business models of industry stalwarts like pharmaceutical giant Roche (NASDAQOTH: RHHBY), medical device maker DexCom (NASDAQ: DXCM), and EHR (electronic health records) provider Allscripts (NASDAQ: MDRX).

Google vs. the biopharmaceutical industry

Last September, Google created Calico (California Life Company), a new health care company focused on combating aging and related diseases. Calico's stated goals include extending the average human lifespan by 20 to 100 years and treating age-related diseases like Alzheimer's, cancer, and heart disease.

The quest to slow or reverse aging isn't a new one. In 2008, GlaxoSmithKline (NYSE: GSK) acquired Sirtris Pharmaceuticals for $720 million for its pipeline of sirtuins, a class of enzymes that are believed to be integral in slowing the aging process. So far, the technology has been a far cry from a "fountain of youth" -- a handful of studies testing sirtuin-based drugs on inflammatory diseases and type 2 diabetes have proved inconclusive.

However, Google has hired some of the brightest minds in the business to run Calico. Calico is led by Arthur Levinson, the former CEO of Genentech. Genentech, which was acquired by Roche in 2009 for $46.8 billion, today forms the core of Roche's cancer portfolio.

The blockbuster drugs that were developed while Levinson was Genentech's CEO -- Rituxan, Avastin, Tarceva, Herceptin, and many others -- not only improved the lives of cancer patients worldwide, but also define Roche as a company today. Together, those four aforementioned drugs generated over $20 billion in sales in fiscal 2013.

Hal Barron, Roche's former executive vice president, head of global product development, and chief medical officer, also joined Calico as its new president of research and development. David Botstein, the former head of Princeton University's Lewis-Sigler Institute of Integrative Genomics, is now Calico's chief science officer.

All of these hires indicate to me that Google is positioning Calico to become the next Genentech. While Calico's pipeline hasn't been revealed yet, in 10 years Calico could be manufacturing next-generation cancer drugs that could challenge Roche/Genentech's massive footprint in oncology.

Google vs. the medical device industry

Google has also been experimenting with smart medical devices. Its latest effort, the smart contact lens for diabetic patients, tries to read glucose levels from tears, which could render painful pinpricks obsolete. Google is also testing the lens' ability to synchronize wirelessly with a mobile device, similar to Johnson & Johnson's OneTouch Verio Sync Meter, which was approved in March 2013.

If Google's smart contacts are approved, companies that primarily make continuous glucose monitors, such as DexCom, could be in trouble.

DexCom is one of the fastest-growing names in the field -- last quarter, its product revenue surged 101% year over year to $42.5 million while its net loss narrowed from $0.25 to $0.08 per share.

More importantly, DexCom and Johnson & Johnson are partnered in the creation of a wearable artificial pancreas to compete against Medtronic's MiniMed 530G, a first-generation device that was approved last September.

Google's smart contacts could disrupt both devices, since they both use continuous glucose monitoring (CGM) technology integrated with insulin pumps. A smart contact lens would displace the need for a CGM device, since it could synchronize wirelessly to the insulin pump instead.

While Google's smart contacts are experimental, they represent the search giant's vision for medicine, which leapfrogs over the current generation medical devices. The secretive Google X Team, which developed Google Glass, driverless cars, Wi-Fi balloons, and the smart contacts, recently held a meeting with the FDA in regards to an unnamed medical device.

Whatever Google has in store won't immediately disrupt DexCom's or Medtronic's medical device businesses, but a new generation of tiny cloud-connected medical devices might alter the market landscape within a decade.

Google vs. the EHR industry

Last but not least, Google could unite the fragmented EHR market. EHRs are considered an essential part of modernizing hospitals, since digitally streamlining patient records can boost productivity and minimize errors.

Unfortunately, the current market is a fragmented mess. According to an annual presentation last May by SK&A, Allscripts is the overall market leader with a 10.6% share of the market, thanks to its dominance of small practices with four to 10 physicians. Meanwhile, privately-held Epic is the leader in large practices with 41 or more physicians, and holds an overall market share of 10.3%.

That fragmentation has made the EHR market a very tough one to unify. Google tried once, with Google Health in 2008, but the service was scrapped in 2011 due to the fragmented state of the market, administrative apathy, and a lack of users.

The good news is that EHR technology has considerably improved over the past two years thanks to Apple's iPad gaining prominence in hospitals. Just three iPads can replace a computer on wheels (COW) and save a hospital $6,000, according to CareCloud's Ahmed Mori.

In April 2012, Allscripts launched a native iPad EHR app (untethered to a desktop platform), named Allscripts Wand. Native iPad EHR apps offer more touch-friendly menus, use Siri for voice-to-text documentation, and take advantage of the iPad's onboard camera for easier patient documentation.

Google Glass, which is scheduled to launch later this year, is the key to disrupting this market. Regardless of the convenience of native iPad EHRs, iPads can never be a hands-free experience. Last October, Philips and Accenture teamed up to launch a proof of concept of Google Glass as a medical device. The proof of concept device was able to display vital patient data and synchronize wirelessly to Accenture's EHR service, offering a truly hands-free, voice-controlled EHR experience.

Therefore, regardless of the EHR provider, Google Glass -- which also has an onboard camera, voice recognition, and cloud connectivity -- could soon replace the iPad as the preferred method of accessing EHRs.

The Foolish takeaway

In conclusion, investors in tech and health care should keep a close eye on Google's growing footprint in these medical industries.

While Google's projects seem like "moon shots" (as the company likes to call them), they could grow into major pillars of growth for the company in a decade and disrupt unexpected fields like biotech, medical devices, and EHRs.

Sony’s new genome analysis company is not playing catch-up with Calico

By Graham Templeton on January 27, 2014

In a move that is becoming increasingly common for international technology giants, Sony will make a startling new push into an area of research that doesn’t seem to overlap at all with its current projects. Diversification is the corporate buzzword of the day, a frantic attempt to spread out a company’s weight as it tries to keep from falling through ever-thinning economic ice. The internet is threatening established media, display technology is advancing down all sorts of new paths, and the cloud is even bringing the necessity of physical processing hardware into question; even a company as diverse as Sony must look at its frankly incredible array of products and wonder whether it might still be just a few Kickstarter success stories away from total irrelevance.

That being the case, the company has decided to enter into a partnership with Japanese medical giant M3 and genetics pioneer Illumina to create a new company named P5. Though details are scarce right now, the new company will focus on creating a “genetic information platform,” which we can safely assume means a proprietary stab at making salable products out of the promise of personalized medicine. Almost certainly the first planned device is a cheap genetic sequencing and annotation tool meant for quick identification of genetic problems or warning signs.

The short-term target of the company is the research and testing industry, but sales direct to the public are an openly stated goal, as well. The biomedical industry is already one of the largest economic sectors there is, and that’s while working entirely through third parties (insurance companies and medical professionals). If technology could allow companies like Sony to safely cut out such inherent inefficiencies, the market could explode even further.

This differs from Google’s announcement of Calico in a few key ways. First, Sony has a robust history as a hardware company, while Google still primarily makes software. Second, Sony’s goals seem much less lofty than Google’s; all modern hospitals already house equipment with at least a few recognizable electronics logos, and Sony just wants a (bigger) piece of that action. That genetic analysis will undoubtedly have the effect of lengthening lifespans doesn’t mean the two projects have the same goal. Sony wants to fill a very specific niche, and has a direct plan to recoup expenses; Google seems to have a much more general understanding of the relationship between health and profit, and trusts that a more wide-based approach will work out in the end.

In late 2012, Sony announced a general plan to enter the health industry with the explicit aim of cornering an emerging market. They see medicine as a new frontier in technology that could work its way into the average home. Could they sell you a hardware-software health kit that tracks and advises on health the way a financial suite might do for your budget? Could they sell you a testing platform at a loss, then recoup that loss through monthly health monitoring fees? Will you have to pay for version upgrades capable of testing more accurately, or for a wider array of problems?

This sort of research need not confine itself to hypotheticals, however. The $664 million purchase of camera-maker Olympus was reportedly to gain control of its medical endoscope business, but the research necessary to improve those devices could also drive the development of Sony’s 4K TVs. There are multiple possible applications for virtually every sort of research, and there are few if any companies better positioned to take advantage of the full spectrum of applications for a new technology. With many of Sony’s traditional businesses struggling in recent years, last year saw the company take in a majority of its profit from the financial services branch – this is a company that knows full well the importance of embracing new and seemingly uncharacteristic ideas.

Genome analysis is one of the fastest-emerging fields in the world, recently passing the $1,000-per-genome milestone and continuing to advance with no sign of slowing. Quite literally, the $100 genome might not be far away — and any company not poised to exploit that market the instant it appears could easily find itself frozen out for good. Sony and Google have big plans to make sure that doesn’t happen. All that remains to be seen is who else will throw their hat into the ring.

Sony Corp. intends to make its medical sector a core part of its business through a foray into genome information analysis, according to sources.

August 29, 2013


Sony plans to establish a joint venture with Illumina Inc., the world’s largest gene analysis equipment maker, and its own group company M3 Inc. as early as October, the sources said. It would take on contracts to analyze blood samples provided by hospitals and other medical institutions.

The company also plans to accumulate analysis data separately from personal information of the patients, and sell it to pharmaceutical firms, research institutes and other establishments, the sources said.

Last October, Sony President Kazuo Hirai announced that his company would raise sales of its medical business from the current tens of billions of yen to 200 billion yen ($2.04 billion) by 2020.

Genome information analysis would give Sony’s medical sector expansion a burst of momentum. It is utilized in a rapidly expanding range of medical fields, such as predicting diseases a person is likely to develop.

Hollywood star Angelina Jolie made headlines in May by disclosing that she had undergone a preventive double mastectomy after genetic testing found she was at a high risk of developing breast cancer.

Currently in Japan, genomic analyses are conducted mainly by the Riken national research institute in Wako, Saitama Prefecture, and other large research establishments.

A senior Riken official said an increasing number of companies will likely enter the genome analysis business.

“Genome analysis has become easier to conduct thanks to the introduction of more advanced equipment,” said Naoto Kondo, director of Riken’s Genome Network Analysis Support Facility.

M3, which has been supplying information on medical papers as well as other services for doctors, already has a number of established clients in Japan. Utilizing M3’s name recognition and solid customer base, Sony aims to receive as many orders as possible for its new business.

SAP Pushes Easy-to-Use Software at Hub to Compete With Facebook

Bloomberg News

By Aaron Ricadela and Cornelius Rahn February 12, 2014

SAP AG (SAP)’s founder Hasso Plattner said the largest business-software maker’s programs remain too cumbersome to impress a generation of users weaned on slick apps made by Google Inc. and Facebook Inc.

To help attract the talent SAP needs to keep pace with users’ expectations, Plattner today opened a development center outside Berlin to incubate technology and foster young programmers. The “Innovation Center” sits on a Potsdam lakeside 9 kilometers (5.6 miles) from the research institute that bears the SAP chairman’s name.

SAP plans to make the site a key development hub and will potentially double its staff to 150 to work closely with technology startups, said a person familiar with the matter, who asked not to be named because the plan isn’t public.

“The biggest weakness of SAP worldwide is the user interaction,” said the 70-year-old, who was joined by other top SAP executives. “We don’t want to leave anyone behind, but we have to move this franchise forward.”

The company Plattner co-founded 42 years ago is undertaking a tricky transition from programs for managing supply chains and financial systems installed on customers’ computer servers to newer cloud-computing applications delivered over the Internet. Hand in hand with the change are a database called Hana and a new user interface that offer customers faster response times and a more Web-like look.

400 Kilometers

“SAP has to be closer to the 150,000 students in Potsdam and Berlin,” Plattner said, wearing a purple sweater and addressing a room full of staff, customers and journalists. “Not everyone wants to come to Walldorf.”

Plattner was referring to SAP’s headquarters more than 400 kilometers southwest of Berlin. “We have to retrain 20,000 developers inside SAP.”

To be sure, re-making the business software to attract new customers isn’t as simple as opening a shiny development center near the German capital, a point Plattner underscored when he swatted away questions about other parts of the world rivaling California’s Silicon Valley.

The shift to cloud computing is also hurting profit. SAP has already pushed back its operating-margin goal by two years, to 2017, because rented Web software costs less upfront than the programs it traditionally offered.

The company’s revenue may reach 19.1 billion euros ($26 billion) in 2015, according to analysts’ estimates compiled by Bloomberg. The company had forecast 20 billion euros in sales for that year.

Late Cloud

“SAP is thinking more optimistically about growth of its core than we are,” Rick Sherlund, a Nomura Securities analyst who recommends buying SAP shares, said in a Feb. 4 note. “SAP is late to the cloud.”

To bridge the gap, the company is pouring effort into a simpler user interface and developing products SAP can continually update online.

“Those two things have been traditionally difficult for SAP, but we have been putting a lot of energy into that recently,” Vishal Sikka, SAP’s chief technology officer and a board member, said in a recent interview.

At the two-day event where Sikka also spoke, SAP is showcasing how its software can be applied to new areas -- such as personalized medicine and sports statistics analysis -- that can help with the transition.

Co-Chief Executive Officer Bill McDermott also appeared, saying that by 2025, more than 70 percent of workers would belong to the “millennial” generation born after the early 1980s.

Hana Growth

The data-crunching Hana software -- an acronym for high-performance analytic appliance -- rapidly processes millions of database records ranging from business data to genome sequences found in cancer patients’ blood. The vast quantity of information amassed by businesses, governments and universities that requires powerful computers for storage and analysis is often referred to as big data.

SAP’s chairman, with a fortune of $10.6 billion according to the Bloomberg Billionaires Index, has had a long-running professional feud with Oracle Corp. (ORCL:US) CEO Larry Ellison. Plattner incubated Hana, which competes with Oracle, using a small group of developers near Berlin. Sales of Hana grew 61 percent last year to 633 million euros ($864 million), and are a bright spot amid slower growth for traditional software.

The shares were little changed at 57.18 euros at 3:03 p.m. in Frankfurt, valuing the company at 70.3 billion euros.

McDermott, an American who will take over as sole CEO in May, told investors in New York last week that SAP is dedicating more salespeople to online software.

Plattner, who is SAP’s biggest shareholder with a 10 percent stake, also plans to address how the new center in Potsdam can work more closely with the Hasso-Plattner Institute, an academic research body he has personally endowed.

Genome Interpretation Appliance; Designed in California, Developed in Europe & Mexico, Manufactured in Asia?

[We are not there yet - but getting very close ... Pellionisz]


Students and faculty of the Polytechnic University of Tulancingo (UPT) attended the presentation by the Italian researcher Anna Carbone, who taught the short course "Fractals, Hurst exponent and applications", aimed at the university's full-time professors (PTC).

The training was intended to provide a broad overview of the type of research being conducted in Europe, and in particular of the research conducted at the Polytechnic of Turin.

UPT academics said that, once the course was completed, the full-time faculty had the opportunity to make a working visit, in the company of Dr. Carbone, to the Italian-Mexican Innovation Center for High-Technology Manufacturing of Hidalgo (CIIMMATH), located in Ciudad Sahagún, in order to identify areas of opportunity for knowledge transfer and human-resource development, in this case work placements for students of the Polytechnic of Tulancingo.

Additionally, the researcher Carbone gave a specialized course to postgraduate students of UPT, in which she provided more detail about what a fractal is: a structure that is repeated at different scales of observation.

The course also established the mathematical basis of the Hurst exponent, which appears in several areas of applied mathematics, including fractals and chaos theory, long-memory processes, and spectral analysis. Estimation of the Hurst exponent has been applied in areas ranging from biophysics to computer networking, including the analysis of Deoxyribonucleic Acid (DNA) for predicting disease.
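As a concrete illustration of the estimation technique the course covered, below is a minimal, self-contained Python sketch (not from the course material, and simplified for clarity) of the classic rescaled-range (R/S) method for estimating the Hurst exponent of a series. For uncorrelated noise the estimate should come out near 0.5; persistent, long-memory series score closer to 1.

```python
import math
import random
from statistics import mean, pstdev

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    n = len(series)
    logs = []  # (log window size, log mean R/S) pairs
    size = min_chunk
    while size <= n // 2:
        rs_vals = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            m = mean(chunk)
            # cumulative deviation of the chunk from its mean
            cum, dev = 0.0, []
            for x in chunk:
                cum += x - m
                dev.append(cum)
            r = max(dev) - min(dev)   # range of cumulative deviation
            s = pstdev(chunk)         # standard deviation of the chunk
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            logs.append((math.log(size), math.log(mean(rs_vals))))
        size *= 2
    # H is the least-squares slope of log(R/S) versus log(window size)
    xs, ys = zip(*logs)
    xbar, ybar = mean(xs), mean(ys)
    return sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
           sum((x - xbar) ** 2 for x in xs)

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
h = hurst_rs(noise)  # uncorrelated noise: estimate near 0.5
```

Dr. Carbone's own estimator (the detrending moving average) differs in detail, but the scaling idea, fitting a power law of fluctuation size versus observation scale, is the same.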

In a third talk, Anna Carbone delivered the lecture "The Hurst exponent: applications in Finance and Genomics" to undergraduate and engineering students of the university, in which she explained techniques derived from statistical mechanics that she has developed over the past 10 years. These techniques are based on fractal analysis, that is, the measurement of the degree of disorder of a given system, which she illustrated with two applications from her work.

Based on measurements obtained with the method she proposed, one can determine whether a person is healthy or ill and identify the degree of vitality of an organism. The second example came from finance, where the fractal method helps determine the market trend, i.e. whether it is on the rise or shows a negative investment trend. She later toured the campus facilities to see the equipment available.

Gerardo Téllez Reyes, Rector of UPT, said that thanks to linkage with international institutions and the support of the Ministry of Public Education, the renewal of agreements with prestigious foreign institutions results in the exchange of knowledge and research by globally recognized teachers, which benefits students and faculty by broadening the horizons of their areas of education. He added that the Polytechnic of Turin is ranked as the third-best higher-education institution in Italy, and that the researcher Anna Carbone demonstrates the quality of that university.


Anna Carbone obtained her Master of Science degree in 1988 and a PhD in Physics in 1993, both from the Polytechnic of Turin. Since 1998 she has been a researcher in the Department of Applied Science and Technology at the same university, where she directs the Fluctuations and Noise Laboratory.

Her research concerns the theoretical and experimental analysis of noise and stochastic processes in different systems. She is the author of numerous research articles published in international journals.

She is a member of international scientific committees and an editor of journals such as Physica A and Frontiers in Fractal Physiology, among others, and she collaborates in various European projects.

[The "Genome Interpretation Appliance" - like everything else - is likely to be manufactured in Asia; see the involvement of SONY and SAMSUNG already. For the key parallel chips, NVidia GPU and/or Xilinx FPGA; for multi-core chips, Intel and/or AMD are contenders. Global integrators like Google AND Apple (Calico), Cisco, Amazon, HP, DELL are already lined up. Chinese BGI, Boston-based Broad-affiliated Foundation Medicine, and Houston-based Baylor are already deep into (cancer) applications. The "hard part" is to build the new science of Genome Interpretation by a software-enabling algorithmic approach. Fractals have no reasonable alternative. Moreover, a global cooperative effort is emerging - chalk up Mexico and Europe (Italy) - Pellionisz]

Patent War-Weary Samsung Inks Cross-Licensing Deal with Cisco

Feb. 7, 2014 | DH Kass | The VAR Guy

Maybe Samsung is a little punchy from fighting intellectual property wars on multiple fronts worldwide. The Korean manufacturer signed a new, 10-year cross-licensing deal with Cisco, giving each company access to the other’s current patent portfolios and to whatever product and technology IP each records over the agreement's lifetime.

The Korean manufacturer certainly seems in a patent peace-making frame of mind now.

Samsung's latest truce overture is a new, 10-year, broad cross-licensing deal with networking giant Cisco Systems (CSCO), giving each company access to the other’s current patent portfolios and to whatever product and technology IP each records over the agreement's lifetime. Neither party disclosed contract terms.

In a statement, the vendors called the agreement an “important industry step to enhance cooperation by curbing unnecessary patent litigation,” more of which has encumbered Samsung than Cisco of late. The deal marks Samsung’s third cross-licensing agreement in just the past few weeks, as the device maker appears to want to remove as much wireless patent clutter from its path as possible.

In January, Samsung ended a protracted patent battle with Ericsson (ERIC) when the two heavyweights agreed to cross-license each other’s cellular technologies, effectively ending a dispute stemming from the mobile network equipment maker’s lawsuit filed late in 2012 claiming the Korean device maker had infringed its intellectual property.

And, with overtones of its Cisco IP deal, Samsung earlier initiated a 10-year, global patent cross-licensing pact with Google (GOOG) that could include thousands of patents, not only for existing technologies and businesses but also for patents filed for the life of the agreement. Samsung and Google are regarded as longtime collaborators but until now the two had not signed a formal intellectual property deal.

Dan Lang, Cisco IP vice president, said the licensing agreement with Samsung will boost innovation for future products and services and lower the possibility that the vendors might sue each other over patent infringement conflicts.

"Innovation is stifled all too often in today's overly litigious environment," he said. "By cross-licensing our patent portfolios, Cisco and Samsung are taking important steps to reverse the trend and advance innovation and freedom of operation."

Dr. Seungho Ahn, Samsung Intellectual Property Center head, said the vendor expects the IP pact will “result in mutual growth and, ultimately, for the benefit of both companies' customers across the world."

[In the prolonged era when global health-care was dominated only by "Big Pharma", the increasingly globalized mergers and acquisitions of Pharma-companies could resolve precious intellectual property issues. With the new era of "Genome-based Information Technology" the (partial) list of global IT-companies listed below is expected to grow at a feverish rate. Chalk up Cisco to the list with the present announcement of a Cisco-Samsung agreement between a US-based and a Korea-based IT giant. While such alliances might reduce the workload of lawyers, it is unlikely that they will become unemployed anytime soon, however. Consider for example that Samsung now allied itself with Cisco of the USA - but it is also allied with Google! How would this affect e.g. Calico, Inc. that is in part Google, but Samsung's arch-enemy (Apple) is also involved? Given the $Trillion ticket of "intellectual property wars" (e.g. between Samsung/Apple), it may be imperative not only to form alliances to prevent "wars flaring up", but use the rather common practice of "gathering an Intellectual Property portfolio" also in the emerging market of "Genome-based economy". The legal bill of fighting over smartphone features is likely to be dwarfed by global battles over intellectual property used by "IT-Titans" - AJP]

These 3 Tech Titans Could Revolutionize Health Care

by Leo Sun, The Motley Fool, Jan 28, 2014

When most people think of Sony, Samsung Electronics or Google, they think of video games, smartphones, tablets, or search engines. These days, however, these tech giants are taking huge steps to expand their technological capabilities beyond the everyday consumer.

Sony, Samsung, and Google have all recently made unexpected moves into the health care industry -- a high-growth market where technology is merging symbiotically with life sciences and medicine.

Let's take a closer look at some fascinating ways that these three companies could revolutionize health care with their new investments.

Sony launches a genome information platform

Sony, the Japanese conglomerate best known for its electronics, video games, and media businesses, is now interested in genome sequencing.

Genome research, which analyzes human genetic data in comparison to other medical or scientific data, is widely believed to be the key to identifying the origins of diseases and developing personalized treatments for patients.

On January 23, Sony announced a partnership with Japanese medical portal M3 and life sciences giant Illumina to launch a genome information company, known as P5, in Japan by the end of February. Sony and M3 will establish P5, with Illumina acting as a minority investor.

The goal of P5 is to provide a genome analysis service to medical and research institutions across Japan, with a long-term goal of the creation of personalized medicine and health care services.

The establishment of P5 comes at a time when the costs of human genome sequencing are dropping substantially -- earlier this month, Illumina broke the "genome sound barrier" with a new machine that can sequence the entire human genome for $1,000. To put that into perspective, the same process would have cost $250,000 a decade ago, and the process on current high-end machines still costs $3,000 to $5,000. 

Illumina's HiSeq X Ten system can sequence an entire human genome for $1,000. Source: Illumina.

Since Sony has partnered with M3, it's assumed that P5 will store the collected data in the cloud, which means that individual patients could eventually access their genetic records online.

Sony executive Tadashi Saito stated that Sony was positioning the medical business as one of the company's "key growth pillars," which would be a huge change from the status quo. Today, Sony's tiny medical business is tucked away in its imaging equipment segment (including cameras), which accounts for 10% of its total revenue.

Samsung's quiet evolution into a medical giant

South Korean tech giant Samsung Electronics has similar goals, with an ambitious plan to become one of the world's largest medical equipment companies by 2020.

Samsung forecast $400 billion in annual revenue by 2020, with $10 billion (2.5%) of its top line eventually being generated by medical devices. The company's big push started with its acquisition of ultrasound maker Medison in 2010, followed by health care equipment maker Nexus in 2011 and the medical imaging company NeuroLogica in 2013. All this inorganic growth culminated in the launch of GEO, Samsung's new line of digital radiology and in-vitro diagnostic equipment.

Samsung's medical device business is still relatively small -- it generated $300 million in sales in fiscal 2012, with an expectation for sales of $500 million in fiscal 2013. However, Samsung's growth plans could put it on a collision course with the Goliaths of the field -- General Electric, Koninklijke Philips, and Siemens -- which generated combined sales of nearly $50 billion in fiscal 2012.

Samsung Electronics also owns a 40% stake in Samsung BioLogics -- a separate company that intends to become "a world leader in biologics development and manufacturing." Biologics are medicines synthesized biologically (vaccines, antibodies, proteins) and not chemically. They include some of the best-selling blockbuster drugs today, such as AbbVie's Humira, Johnson & Johnson and Merck's Remicade, and Amgen's Enbrel.

Last October, Roche signed a manufacturing agreement with Samsung Biologics to manufacture Roche's proprietary commercial biologic medicines in Incheon, South Korea. These ambitious investments in medical equipment and biologics put Samsung Electronics in a strong position to shake up the health care industry, as it has done with smartphones and tablets over the past five years.

Google's ambitious blitz on medical devices and medicine

Last but definitely not least, search giant Google's ambitious investments in health care could shake up the industry. Google Glass, which is scheduled to be released in April, has already been considered a next-generation tool for physicians, thanks to its onboard camera and Internet connectivity. Several major companies, such as Qualcomm and Philips, are experimenting with different medical applications for Google Glass in idea incubators.

Google also recently announced that it was developing a smart contact lens for diabetics as a replacement for the daily pinpricks necessary for glucose monitoring. Google's approach is completely new -- it tests glucose levels through tears. If the project is successful, the lens could synchronize to smartphones or other devices, similar to newer glucose monitors like Sanofi's iBGStar, and render painful pinpricks obsolete.

In the biotech field, Google created Calico (California Life Company) last September. Calico is headed by former Genentech (now a Roche subsidiary) CEO Arthur Levinson. The company's goals are incredibly ambitious, with an ultimate goal to extend the average human lifespan by 20 to 100 years and treat age-related diseases such as Alzheimer's, cancer, and heart disease.

Google clearly isn't expanding into the health care industry because of an immediate need for profit, since its core revenue is still generated by search advertising. Yet that's what makes Google's efforts all the more fascinating. With nearly $55 billion in cash and equivalents, the sky's the limit for Google's medical ambitions.

The Foolish takeaway

Sony, Samsung, and Google are only three of the companies benefiting from the convergence of the tech and medical industries, a convergence that could revolutionize health care.

With the increased use of medical portals, smartphones, and tablets in hospitals, it's likely that other tech companies, traditionally associated with consumer or business products, will also expand into the health-care market. When these top minds in tech and health care pool their efforts, medical devices and biotechnology could improve the lives of patients worldwide.


[As Dr. Pellionisz widely disseminated in his 2008 prediction in a Google Tech Talk YouTube (Is IT Ready for the Dreaded DNA Data Deluge?), not only the three global IT-leader companies listed above have moved for a pivotal position. One could easily add GE Health, Amazon, Dell, HP and chip companies Intel, NVidia, AMD, Xilinx, Altera in the USA, and Siemens, Philips in Europe. A most interesting further global player might be Microsoft - with its new CEO totally aware of the global role that India plays in software development and clinical trials (perhaps in an alliance with Tata). Since Intellectual Property of Genome Analytics plays a key role (no matter what the hardware platform may be for a "Genome Appliance" to be used in US hospital systems, e.g. for precision cancer therapy), FractoGene US patent 8,280,641, in an appropriate licensing agreement, can be leveraged globally, for the most lucrative US market - Pellionisz, HolGenTech_at_gmail_dot_com]


Written By: Jason Dorrier

Posted: 02/2/14 7:00 AM

Illumina, the biggest maker of genomic sequencing machines, says it has broken the “sound barrier” of sequencing. Its latest machine can transcribe 18,000 human genomes a year for $1,000 per genome - a mark dreamed of for over a decade.

When the human genome’s three billion molecular pairs were first fully transcribed (or sequenced) in 2003, it was deemed a seminal accomplishment, but one seemingly destined to be repeated only rarely. The project’s cost totaled $2.7 billion, and the first genomes were hundreds of millions of dollars apiece.

It was around that time that the $1,000 genome was first targeted. Why $1,000? Well, it’s a nice, round, pretty much arbitrary number - a sort of mile marker set by researchers. If ever it gets that cheap, scientists thought, we’ll be able to do anything.

In the early days, many experts thought it was an impossible target, Raymond McCauley, Chair of the Biotech Track at Singularity University, recently told Singularity Hub. But as costs began falling, the $1,000 genome started looking more realistic.

In 2006, the New York Times ran an article titled, “The Quest for the $1,000 Genome.” At the time, per genome sequencing costs had declined a full order of magnitude. Yet, despite a pace on par with Moore’s Law, if progress had continued along that line, the cost to sequence the human genome would still be over $1 million in 2014.

But it’s not. Not even close. Shortly after The New York Times article was published, next-generation sequencing hit the commercial market in a big way. What had, in the beginning, seemed an almost mythical goal started to look not only like a foregone conclusion - but one that would be realized far sooner than imagined.

Traditional sequencing (or Sanger sequencing) transcribes DNA using a complex, time-consuming, and costly chemical process. Next-generation machines, on the other hand, are more like supercomputers. They work in parallel, using brute force to simultaneously read millions of DNA base pairs at a time.

In the last six years, primarily on the back of next-generation sequencing technologies, costs have far outpaced Moore’s Law, falling four orders of magnitude from $10 million at the end of 2007 to under $5,000 at the end of 2013. 
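To put the two growth rates in this paragraph on the same footing, here is a back-of-the-envelope comparison, a sketch using only the figures quoted above ($10 million at the end of 2007, under $5,000 at the end of 2013), not numbers from Illumina or the NHGRI:

```python
import math

# Figures quoted in the article: per-genome sequencing cost over six years.
start_cost, end_cost, years = 10_000_000, 5_000, 6.0  # end of 2007 -> end of 2013

# Annualized factor by which cost shrank each year.
annual_factor = (start_cost / end_cost) ** (1 / years)

# Moore's Law pace for comparison: cost halves roughly every two years.
moore_factor = 2 ** (1 / 2.0)

print(f"sequencing cost fell ~{annual_factor:.1f}x per year")
print(f"Moore's Law pace would be ~{moore_factor:.2f}x per year")
```

The quoted drop works out to roughly a 3.5x cost reduction per year, against about 1.4x per year for Moore's Law, which is why the article can say sequencing "far outpaced" it.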

Now, owners of Illumina’s $10 million HiSeq X Ten may run 18,000 genomes a year for $1,000 a pop. That’s about a fifth of all the genomes that have been sequenced in history at a fraction of the expense.

It’s a staggering achievement a mere decade in the making. And as McCauley notes, “$1,000 is a good base cost because it’s cheaper than an MRI; it’s cheaper than a lot of CT scans or a chest x-ray in some places.”

But it’s worth a moment’s pause to discuss what Illumina’s claim really means.

As the name implies, the HiSeq X Ten isn’t really one machine. It’s ten. The supercomputer analogy is instructive here too. Though there are more transistors per chip these days, the fastest computers mainly get faster by packing more chips into more cabinets—they still fill entire rooms.

Also, the cost of sequencing is a sensitive, much hyped subject. The $1,000 genome has been just around the corner for roughly two years now. Beyond flashy PR claims, cost calculations can be controversial.

In the past, firms would include the expense of chemicals but exclude key line items like depreciation and labor, sparking fierce debates online—a fact, no doubt, Illumina knows well. They were careful to include chemicals, depreciation, and labor (albeit cheap labor, according to some) in their estimate.

McCauley says, “You can tell that they’ve been working on it and waiting to get that marginal cost of chemicals and reagents down below a certain figure where the all-in cost for materials, the depreciation on the machine, and all of that, is right about at $1,000.”

Even so, he went on to tell us, the numbers haven’t been nailed down yet.

The National Human Genome Research Institute (NHGRI) publishes the most referenced graph (see above) on sequencing costs. NHGRI uses the cost of sequencing reported in research papers. That is, instead of going by a firm’s initial claim, they wait to see how the machine performs in practical use.

According to Illumina, they’ve sold machines to Korean genomics firm, Macrogen, the Broad Institute in Cambridge, Massachusetts, and the Garvan Institute of Medical Research in Sydney, Australia. The machines are due to ship the first quarter of 2014, and of course, they’ll take time to ramp up to full capacity.

There’s one more item worth noting—and this point may be somewhat lost in the chatter—you or I can’t go to a clinic to get our genome sequenced for $1,000. Not yet at least.

Machines like the HiSeq X Ten focus on sequencing for research purposes, for scientists conducting broad studies of thousands and tens of thousands of human genomes at a time. By comparing a large number of genomes, researchers can associate conditions and diseases with particular genes. This research is crucial if we are to replace DNA transcription with understanding and fluency in the language.

At $1,000 per genome, such work will likely accelerate.

McCauley notes that for a million dollars we could sequence one genome six years ago. Four years ago we could sequence ten; three years ago we could sequence a hundred; and now we can sequence a thousand.

“It’s one thing to say we’ve sequenced a single reference genome at the beginning of the century,” McCauley says, “It’s another thing to have a million genomes in a database. With this scale of improvement that becomes very reachable. And in fact, in aggregate, I think over the next couple years we are going to see a million genomes sequenced.”

But where all that data goes is an open question. A significant chunk may be tied up in proprietary, private pharmaceutical drug testing databases instead of being made publicly available by publicly-funded research groups. For the sake of faster progress, according to McCauley, the more of the latter we have, the better.

So, if practical results prove Illumina has truly claimed the $1,000 genome—what happens next? McCauley thinks next-generation sequencing is ripe to be supplanted by next-next generation sequencing. And today’s fast pace will largely continue.

“Anybody who enters into this, they’re living in dog years. They’ve got to figure out how to not only make the technology work but keep improving it at this pace of probably 4x or 5x a year—just to compete.”

And perhaps, increasingly, we’ll see a more diversified approach. Some work, for example, will focus more on speed or miniaturization.

Certain requirements, say sequencing an individual cancer genome, may prove more urgent and less concerned with ultra-low cost. Eventually, McCauley says, we’ll see handheld DNA scanners for environmental identification and security. James Bond stuff.

The current trajectory hints at such sci-fi devices, but not immediately.

Closer in, we need to get a handle on the impending flood of data. It doesn’t make sense to have a team of geneticists poring over every line. On the path to truly clinical genomics, expect software engineers and programmers to lay the next paving stones.

“Interpretation is king from now on,” McCauley says.

["From now on, interpretation is king," says McCauley in 2014. Interpretation has been a Princess at least since 2008, when in a Google Tech Talk YouTube (Is IT Ready for the Dreaded DNA Data Deluge?) this genomist predicted a massive "glut" of DNA sequences, threatening an unsustainable growth of the industry of sequencing - unless "Interpretation" ramps up the demand to match the supply. Not many listened - and the result was painfully experienced when at least half a $Bn of sequencing-company valuation was lost as a result. In the last 12 months, Complete Genomics had to be sold (deep below its real worth) for a mere $117M to CHINA (and likewise, Pacific Biosciences also had to be sold, to Life Technologies). Now it is evident that Illumina will not be able to do "analytics" alone - it has just closed a deal to team up with SONY (of Japan). SAMSUNG (of Korea) has been providing an "analytics service" for about 30 months, and SIEMENS (of Germany) is also looking carefully at the emerging market. In the USA, the newly announced Calico, Inc. (jointly by GOOGLE and APPLE) will, by necessity, arrive at the same conclusion - that a "Genome Analytics Appliance" must be constructed, enabled by proprietary IP for analytics (such as FractoGene, US patent 8,280,641, in force till late March 2026). As for technology, a key question for any/all such plans is whether the required HPC will deploy GPU and/or FPGA parallel processors. As for the US hospital (cancer) market, a key is the deployment and clinical trial system, see here. For further info, contact Dr. Pellionisz, HolGenTech_at_gmail_dot_com.]

LincRNA, once believed useless, plays role in genome

Medical Press
January 23, 2014

By B.D. Colen

[John Rinn in the middle - AJP]

Ever since the Human Genome Project decoded the genome, the prevailing scientific view has been that only the 2 percent that makes proteins—the building blocks of cells—was important. The rest was deemed not functional, or "junk." But from his days in graduate school, through his postdoctoral fellowship, and now as a Harvard Stem Cell scientist, John Rinn has been digging through the genome, challenging that prevailing belief.

Now, Rinn and his Harvard Stem Cell Institute colleagues, including neurobiologist Paola Arlotta, have carried out an elegant and important experiment in which they have demonstrated that much of what had been dismissed in fact plays as vital a role as protein-coding genes.

Rinn and his colleagues have generated 18 strains of mutant mice, removing from each a different piece of "junk" genome, or so-called long intergenic noncoding RNAs (lincRNAs). If the lincRNA truly had been "junk," nothing should have happened.

What the researchers found was the opposite. Obvious defects were observed in seven of the 18 mutant strains. Three died shortly after birth, and there is reason to believe many others also are defective. The authors reported these and more findings in the new online journal eLife, initiated by the Howard Hughes Medical Institute, the Wellcome Trust, and the Max Planck Society.

Martin Sauvageau, Loyal Goff, and Simona Lodato, the lead authors on the study, say it will take years to study all the mice, but an initial global characterization determined that seven had obvious defects, including early death, inadequate size, and lung and brain defects. Others had problems that were more subtle.

"There has been a lot of skepticism whether these long noncoding RNAs are important for living organisms," said Rinn, "but you can't say this is junk without testing it. The question will always be what percentage is junk, and what's functional. But what we need to do is look at lincRNAs with the same careful focus we've applied to protein-coding genes, using genetics to characterize them, and see what role they play on a molecular and organismal scale."

Rinn and Arlotta see their collaboration as unusual and yet indicative of how the Department of Stem Cell and Regenerative Biology, in which they are both faculty members, brings together scientists of disparate backgrounds and interests.

"How do a neuroscientist and an RNA biologist come together?" Rinn asked rhetorically. "That's the story of our department: two scientists coming together to do something neither would have done on their own. And our labs have become seamless. That's something the department aims for.

"I'm a geneticist who had never worked with mice. Paola is a neuroscientist who had experience with mice, but for whom this kind of genetic experiment was new. But she has a history of being a classic neurobiologist who takes risky, successful technological leaps. This set of experiments constituted a huge risk for both of us, and for the postdocs who did the work," Rinn said.

"We got into this because we thought these [lincRNA] molecules could be another layer of explanation of how a tissue as complex and mysterious as the brain is made," said Arlotta. "And indeed we found that these are key molecules for brain development."

The lincRNAs control the stem cells that give rise to a "particular class of neurons … that are important in the expansion of the brain, in the part that controls intelligent behavior, cognition and perception, the part of the brain that makes us," Arlotta said.

"When we removed the specific lincRNA, we looked at the mouse brain and the progenitors were reduced. As a consequence probably, the population [of neurons] that sits on top of the cerebral cortex are reduced … It's likely that in the future we'll see a number of studies showing that other lincRNAs are involved in specific behaviors," Arlotta said. "The brain likes this junk RNA."

[Beyond the above journalistic coverage, the peer-reviewed science paper is found below - AJP]

Multiple knockout mouse models reveal lincRNAs are required for life and brain development

[Positive experimental evidence against the "Junk DNA" dead-end (see in full in Ohno 1972 seeding the wrong idea that most of the human DNA is there "for the importance of doing nothing"). The "Junk School of Genomics" was exactly what Mattick referred to over a decade ago as "the biggest mistake of the history of molecular biology" - AJP]

Martin Sauvageau, Loyal A Goff, Simona Lodato, Boyan Bonev, Abigail F Groff, Chiara Gerhardinger, Diana B Sanchez-Gomez, Ezgi Hacisuleyman, Eric Li, Matthew Spence, Stephen C Liapis, William Mallard, Michael Morse, Mavis R Swerdel, Michael F D’Ecclessis, Jennifer C Moore, Venus Lai, Guochun Gong, George D Yancopoulos, David Frendewey, Manolis Kellis, Ronald P Hart, David M Valenzuela, Paola Arlotta, John L Rinn

Harvard University, United States; Broad Institute of MIT and Harvard, United States; Massachusetts Institute of Technology, United States; Rutgers, The State University of New Jersey, United States; Regeneron Pharmaceuticals Inc., United States; Harvard Medical School, United States

Published December 31, 2013. Cite as eLife 2013;2:e01749.

Many studies are uncovering functional roles for long noncoding RNAs (lncRNAs), yet few have been tested for in vivo relevance through genetic ablation in animal models. To investigate the functional relevance of lncRNAs in various physiological conditions, we have developed a collection of 18 lncRNA knockout strains in which the locus is maintained transcriptionally active. Initial characterization revealed peri- and postnatal lethal phenotypes in three mutant strains (Fendrr, Peril, and Mdgt), the latter two exhibiting incomplete penetrance and growth defects in survivors. We also report growth defects for two additional mutant strains (linc–Brn1b and linc–Pint). Further analysis revealed defects in lung, gastrointestinal tract, and heart in Fendrr−/− neonates, whereas linc–Brn1b−/− mutants displayed distinct abnormalities in the generation of upper layer II–IV neurons in the neocortex. This study demonstrates that lncRNAs play critical roles in vivo and provides a framework and impetus for future larger-scale functional investigation into the roles of lncRNA molecules. - See more here.

[Half a Century after Nobel-laureates Jacob and Monod, this spectacular experimental evidence will rapidly accelerate the advancement of the new mathematical theory of recursive genome regulation. As Mattick, a long-time pioneer of RNA interpretation, expressed so emphatically, "most assumptions are wrong". An exceptional beauty of the experimental study is that it calls for a unification of neuroscience and genomics - AJP]

Sony forms genome analysis company [with Illumina] in move towards personalized medicine

The Verge,

Sam Byford on January 23, 2014 02:09 am

Sony is forming a new company to focus on genome research with a view towards realizing personalized medicine and healthcare. The venture, currently called P5 Inc, is a collaboration with M3, a medical company in which Sony is the majority stakeholder, and Illumina, a US-based manufacturer of genome-sequencing equipment. News of the tie-up between Sony and Illumina, which will be a minority investor, was first reported by Jiji Press last year.

Sony refers to the collaboration as a "genome information platform." It will provide genome analysis services for enterprises and research institutions in Japan at first, as well as aggregating genetic data with other medical data. Sony expects the business to eventually expand to individual patients by providing the same mix of genome data and medical information to allow for personalized healthcare. Genome research can be used to develop new methods of treatment and identify the origins and likelihood of diseases.

The medical sector is an increasingly important area for Sony. New CEO and president Kaz Hirai announced plans to make it a core pillar of the company's business before taking over in early 2012, and Sony has since made major investments in the field. One such example is the $644 million spent on forming a medical imaging venture with Olympus, which the company said could push forward the development of technology such as 4K or 3D endoscopes.

Pulse of J.P. Morgan 2014: Interviews with 17 biopharma execs

SAN FRANCISCO--Another J.P. Morgan Healthcare Conference is in the books, and biopharma's movers, shakers and up-and-comers have headed back to Boston, Basel and Bangalore. This year, in the shadow of the best market for biotech IPOs since 2000, there's fresh optimism that capital will continue to flow into drug developers' pockets, whether via more market debuts or through the many top-tier VCs putting together new funds.

3 biotech investment trends gleaned from JP Morgan conference

January 17, 2014 8:50 pm by Stephanie Baum

Pharma is warming up to innovation; personalized medicine, particularly for oncology, is hot; and molecular diagnostics have moved from the bench to the bedside. As bullish as investors seem about biotech investments, how long is that likely to last? Those were some of the talking points at the JP Morgan Healthcare conference, where metaphors likening the investment atmosphere to the weather abounded.

IPO window is still wide open, but for how much longer? The strong appetite for biotech initial public offerings (at least 34 in 2013 as of the start of December) is helping to feed the bullishness for continuing investments. Atlas Venture partner Bruce Booth points out a fun fact in one blog entry about the conference: 12 biotech companies started a roadshow this week, a record number of simultaneous biotech road shows. As he puts it, "If the companies have done solid crossover rounds already, or have really compelling assets, the offerings and their pricings should be fine. But if not, good luck to them as it's likely to be tough."

Crowdfunding in biotech is rising. Crowdfunding has been moving beyond the familiar territory of consumer goods into healthcare. Even so, it has primarily focused on medtech and health IT tools for an audience of medical staff or consumers. But Poliwogg's initiative with the Epilepsy Foundation and the Alpha-1 Foundation indicates that 2014 could be the year the crowdfunding model expands into biotech in a significant way. A couple of biotech companies, such as Atlantic Bio Sci, are also using MedStartr's platform.

Corporate venture capital funds investing in innovation. A PricewaterhouseCoopers report on healthcare investment trends pointed out that biotech is in the top five for corporate venture capital investing. Edward Yu, a principal at PricewaterhouseCoopers' Health Industries practice, said companies realize that if they don't invest in innovation, they'll pay for it long term. Corporate venture capital is about connecting the dots of innovation. It's in a company's best interest to ensure the entrepreneurs it invests in have a high likelihood of success. Companies are attracted to the lure of Silicon Valley and take the attitude, 'Why can't I replicate Silicon Valley inside my company?'

Janssen Labs seems to fit that description. It's helping Johnson & Johnson Development Corp by tapping companies with innovative technology and providing them with the resources they need to be successful. It's a moderate gamble for a big pharma company, but given the analysis of the FDA drugs approved last year, big pharma's future depends on reaching beyond its walls and cultivating entrepreneurs. Janssen Labs head Melinda Richter said its three innovation centers have helped change the culture of Johnson & Johnson and have made the company bolder.

Eric Schmidt's 2014 predictions: big genomics and smartphones everywhere

[See video and transcript at the link of this entry]

What does 2014 hold? According to Eric Schmidt, Google's executive chairman, it means smartphones everywhere - and also the possibility of genetics data being used to develop new cures for cancer. [Here the journalist is a bit sloppy, for presently there is NO cure for cancer - AJP]

In an appearance on Bloomberg TV, Schmidt laid out his thoughts about general technological change, Google's biggest mistake, and how Google sees the economy going in 2014.

"The biggest change for consumers is going to be that everyone's going to have a smartphone," Schmidt says. "And the fact that so many people are connected to what is essentially a supercomputer means a whole new generation of applications around entertainment, education, social life, those kinds of things. The trend has been that mobile is winning; it's now won. There are more tablets and phones being sold than personal computers - people are moving to this new architecture very fast."

It's certainly true that tablets and smartphones are outselling PCs - in fact smartphones alone have been doing that since the end of 2010. This year, it's forecast that tablets will have passed "traditional" PCs (desktops, fixed-keyboard laptops) too.

Disrupting business

Next, Schmidt says there's a big change - a disruption - coming for business through the arrival of "big data": "The biggest disruptor that we're sure about is the arrival of big data and machine intelligence everywhere - so the ability [for businesses] to find people, to talk specifically to them, to judge them, to rank what they're doing, to decide what to do with your products, changes every business globally."

He also sees potential in the field of genomics - the parsing of all the data being collected from DNA and gene sequencing. That might not be surprising, given that Google is an investor in 23andme, a gene sequencing company which aims to collect the genomes of a million people so that it can do data-matching analysis on their DNA. (Unfortunately, that plan has hit a snag: 23andme has been told to cease operating by the US Food and Drug Administration because it has failed to respond to inquiries about its testing methods and publication of results.)

Here's what Schmidt has to say on genomics: "The biggest disruption that we don't really know what's going to happen is probably in the genetics area. The ability to have personal genetics records and the ability to start gathering all of the gene sequencing into places will yield discoveries in cancer treatment and diagnostics over the next year that are unfathomably important." [Some synonyms of "unfathomable" are - according to the web - "unmeasurable", limitless, or infinite - AJP]

It may be worth mentioning that "we'll find cures through genomics" has been the promise held up by scientists every year since the human genome was first sequenced. So far, it hasn't happened - as much as anything because human gene variation is remarkably big, and there's still a lot that isn't known about the interaction between what appear to be non-functional parts of our DNA (which don't seem to code for proteins) and the parts that do code for proteins. [Here the journalist is amazingly precise. The obsolete, dovetailing failed axioms of "Junk DNA" and the "Central Dogma" have been defeated ("Last Mohicans" in the marginalized wilderness of disarmingly naive, antiquated beliefs don't matter any more than members - if any - of the "Flat Earth Society"). While all knowledgeable scientists now agree that any and all parts of the genome may relate to protein coding, only FractoGene is a coherent mathematical algorithm positing that the genome does this in a dual valence: parts (formerly, "genes") directly "construct" amino-acids, thus proteins, which is mathematically a contravariant valence, while other parts (formerly collectively misunderstood as "junk DNA") indirectly code for proteins, by means of "measuring" built proteins, which is mathematically a covariant valence. The intrinsic mathematics of the quest for hologenomic equilibrium (life) is nonlinear dynamics (fractals and chaos). Intellectual property is held as a powerful combination of US patent 8,280,641, in force till late March 2026, intertwined with "trade secrets" representing the state of the art beyond the last CIP to said FractoGene patent (2007) - AJP; holgentech_at_gmail_dot_com]
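The "fractal genome" notion invoked here is often illustrated with space-filling curves such as the Hilbert curve, which threads a one-dimensional sequence through two-dimensional space so that points close along the sequence stay close in space - a locality property analogous to the fractal folding proposed for chromatin. A minimal illustrative sketch (the function names `rot` and `d2xy` are this editor's, following the standard textbook distance-to-coordinate algorithm, not code from any work cited above):

```python
def rot(s, x, y, rx, ry):
    """Rotate/flip a quadrant of side s so sub-curves join correctly."""
    if ry == 0:
        if rx == 1:
            x = s - 1 - x
            y = s - 1 - y
        x, y = y, x  # transpose
    return x, y

def d2xy(n, d):
    """Map distance d along a Hilbert curve to (x, y) on an n-by-n grid.

    n must be a power of two; d runs from 0 to n*n - 1.
    """
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        x, y = rot(s, x, y, rx, ry)
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Walking the curve: consecutive sequence positions land on
# adjacent grid cells, and every cell is visited exactly once.
path = [d2xy(8, d) for d in range(64)]
```

The key property demonstrated: any two loci that are neighbors in the 1-D sequence (consecutive `d`) remain neighbors in 2-D, at every scale of the recursion.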

Biggest mistake

As for Google's biggest past mistake, Schmidt says it's missing the rise of Facebook and Twitter: "At Google the biggest mistake that I made was not anticipating the rise of the social networking phenomenon - not a mistake we're going to make again. I guess in our defence we were working on many other things, but we should have been in that area, and I take responsibility for that." The results of that effort to catch up can be seen in the way that Google+ is popping up everywhere - though it's wrong to think of Google+ as a social network, since it's more a way for Google to create a substrate on the web to track individuals.

And what is Google doing in 2014? "Google is very much investing, we're hiring globally, we see strong growth all around the world with the arrival of the internet everywhere. It's all green in that sense from the standpoint of the year. Google benefits from transitions from traditional industries, and shockingly even when things are tough in a country, because we're "return-on-investment"-based advertising - it's smarter to move your advertising from others to Google, so we win no matter whether the industries are in good shape or not, because people need our services, we're very proud of that."

For Google, the sky's the limit: "the key limiter on our growth is our rate of innovation, how smart are we, how clever are we, how quickly can we get these new systems deployed - we want to do that as fast as we can." [A most important legacy of Steve Jobs may be that hiring smart and clever people and developing systems fast practically failed him in the competition with IBM and Microsoft, since smart and clever people are a commodity, and new system development is a direct function of investment in labor. He made Apple the most valuable company in the world, ever, by deploying Intellectual Property acquisition and protection - AJP]

It's worth noting that Schmidt has a shaky track record on predictions. At Le Web in 2011 he famously forecast that developers would be shunning iOS to start developing on Android first, and that Google TV would be installed on 50% of all TVs on sale by summer 2012.

It didn't turn out that way: even now, many apps start on iOS, and Google TV fizzled out as companies such as Logitech found that it didn't work as well as Android to tempt buyers.

Since then, Schmidt has been a lot more cautious about predicting trends and changes - although he hasn't been above the occasional comment which seems calculated to get a rise from his audience, such as telling executives at a Gartner conference that Android was more secure than the iPhone - which they apparently found humorous.

["It is difficult to make predictions, especially about the future". Thus, learning from history and questioning options for the future may be more productive. Overlooking social media is a mistake that Eric Schmidt is taking responsibility for - to the tune of billions of dollars. Given that he properly assesses the implications of genome informatics & instrumentation as "unfathomable" - who is the one who will take charge NOT to miss this colossal historic opportunity? Eric Schmidt is no longer the CEO of Google - Larry Page is. CALICO, established by Google, is led by Art Levinson, former CEO of Genentech and present Chairman of the Board of Apple. In the USA, perhaps the above three towering key men might bear responsibility for whether the challenge is going to be met. - AJP]

We should be looking at a complete paradigm shift

[Have a lucky 2014 - maybe more hope for hundreds of millions of people suffering from genome-regulation diseases, most notably cancers - AJP]

On 5 December, agency director Francis Collins told an advisory committee that the NIH should consider supporting more individual researchers, as opposed to research proposals as it does now. The NIH currently spends less than 5% of its US$30-billion budget on grants for individual researchers.

The idea of funding individuals is not new. There are the HHMI awards, and since 2010, the UK Wellcome Trust, a charity based in London, has directed its funding to promising researchers, allowing them to pursue research directions without oversight.

"We're in an era now where insights will come from people thinking through the implications of data" - rather than simply producing data in order to secure grants. ... Funding researchers for longer periods will free them from having to review and apply for so many grants, which can take up to 40% of their time.

[One could argue whether a "complete paradigm-shift" is even possible, albeit necessary (see video below at 21:30: "people do not like paradigm-shifts"). "Paradigm", from the Greek, is like pillars upholding a construction. Removing them without replacement would generate the kind of collapse, the "horror vacui", that science abhors. Rather than arguing, when concluding another 7-year epoch one could perhaps footnote on science itself, rather than on its perhaps rusty funding habits. This columnist pointed out in 2008 (in a peer-reviewed paper and a Google Tech Talk YouTube) WHAT new foundation (The Principle of Recursive Genome Function) should replace the obsolete twins of "the biggest mistake in the history of molecular biology", the Central Dogma and Junk DNA misnomers. The paradigm-shift is precise, though as subtle in mathematics as the "butterfly effect" ("Can the flap of a butterfly in Brazil trigger a tornado in Texas?"). The "genes/junk theory" had been largely void not only of advanced, but even of quite rudimentary mathematics. A very recent review of nonlinear dynamics in neuroscience and genomics, spanning from Dr. Losa's four conferences on "Fractals in Biology and Medicine" to Dr. Pellionisz' FractoGene, is here in full. - AJP]

[Can a totally determined DNA explain unpredictable, chaotic-fractal Life? See the BBC video on the nonlinear dynamics of the geometry of life. Seven years after the Human Genome Project of 2000, ENCODE revealed that the DNA is "pervasively transcribed". It took another 7 years, from 2007, for mainstream genomics to arrive at the realization that the mathematics of hologenomics is nonlinear dynamics, "beyond the Newtonian dream of total determinism". With 2014 we enter the era of a novel philosophy, based on the "Principle of Genomic Unpredictability" - AJP]
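The "butterfly effect" invoked in the notes above is easy to demonstrate: in a fully deterministic nonlinear system, two trajectories starting a billionth apart soon diverge completely, which is the sense in which a totally determined DNA could still yield unpredictable dynamics. A minimal sketch using the textbook logistic map (the function name `logistic_orbit` is this editor's illustration, not code from any work cited here):

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), fully deterministic
    yet chaotic at r = 4.0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two orbits whose starting points differ by one part in a billion.
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)

# Early on the orbits are indistinguishable; within a few dozen
# iterations the tiny difference is amplified to order one.
early_gap = abs(a[5] - b[5])
late_gap = max(abs(x - y) for x, y in zip(a, b))
```

Here `early_gap` remains microscopic while `late_gap` grows to macroscopic size: determinism without long-range predictability, the hallmark of chaos.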

Russian Billionaire Announces $3M Mathematics Prize


WASHINGTON, December 14 (RIA Novosti) – Top mathematicians will be rewarded for thinking big under a new $3 million prize announced by Russian billionaire Yuri Milner and Facebook founder Mark Zuckerberg.

Milner, a self-described “failed physicist” who made his fortune in high-tech investments, told The Guardian that he wanted the new Breakthrough Prize in Mathematics to encourage people to think more deeply about life. The prize will be awarded for the first time next year.

“If you take the largest scales possible, there are a number of scientists, individuals, who operate at that scale, they think about the whole universe. I think that we focus too much on small scales as human beings, and not enough on larger scales. That’s really the problem we’re trying to address here,” he told the paper Thursday.

The new prize was unveiled at an awards ceremony in the United States for two other multi-million-dollar research prizes established by Milner. The Fundamental Physics Prize, which he founded last year, was shared between Michael Green of Cambridge University and John Schwarz of the California Institute of Technology.

Green succeeded Stephen Hawking as the Lucasian professor of mathematics at Cambridge in 2009. Hawking won the first Fundamental Physics Prize last year.

The Breakthrough Prize in Life Sciences has been awarded to six scientists for their work, and each won $3 million. Milner co-founded the prize a year ago in partnership with Silicon Valley entrepreneurs including Zuckerberg; Sergey Brin, the Russian-American co-founder of Google; and Jack Ma, a Chinese entrepreneur.

The glitzy event in California was presided over by Hollywood star Kevin Spacey. Not everyone was impressed by the largesse of Milner and his fellow billionaires, however.

One “prominent physicist” who asked not to be identified told The Guardian that the money could be put to better use, saying: “The great philanthropists of the 19th and 20th centuries, like the Rockefellers and the Carnegies [or Stanford, nearby to the event - AJP], did not create prizes – they created universities and research institutes that have enabled thousands of scientists to make great breakthroughs over the succeeding decades.

“By contrast, giving a prize has a negligible effect on the progress of science. A few already well-recognized people get enriched, but there is little value added in terms of the progress of science compared with the multiplier effect of creating new institutions for scientific research.”

"The Last Mohican" of "Junk DNA" Obsolete Axiom

It could have been just an empty exercise to try to "avoid losing face" as facts came down on him like a mega-ton of bricks, but a past-retirement-age yet still active professor (of Biochemistry, not Genomics, and of course not in the USA but tucked away in a sparsely populated outskirt) could still be found in late 2013 (December 30) making the rather laughable obsolete claim that "Knowledgeable scientists agree that most (~90%) of our DNA is junk". Since endangered species can disappear without notice, readers are encouraged to report to Holgentech_at_gmail_dot_com if, in the coming year 2014, any not-yet-retired professor can be found at relatively reputable universities/departments where obsolete teaching might still consume precious resources, occasionally even public tax monies. [AJP]

[Breakthrough Towards Deciphering Genomic Code of Life is Likely to Come from Mathematics] Mark Zuckerberg and Yuri Milner announced new $3 million Breakthrough Prize in Mathematics

[Breaking the Genomic Code of Life is more likely to come from Mathematics (Physics, Biophysics and Information Theory) than from old axioms limited to Biochemistry. Yes, this made some biochemist(s) mad as hell - but on the doorstep of 2014 the due course of science appropriately defeated the crazy misconception of junk-genomics (using the phrase from Mattick) - AJP]

2014 Breakthrough Prizes Awarded in Fundamental Physics and Life Sciences for a Total of $21 Million

SAN FRANCISCO, Dec. 13, 2013 /PRNewswire/ -- The names of the 2014 Breakthrough Prize winners in Fundamental Physics and Life Sciences were unveiled at an exclusive ceremony at the NASA Ames Research Center, Mountain View, CA. At a total awarded amount of $21 million, sponsored by Sergey Brin & Anne Wojcicki, Jack Ma & Cathy Zhang, Yuri & Julia Milner and Mark Zuckerberg & Priscilla Chan, the prizes aim to celebrate scientists and generate excitement about the pursuit of science as a career.

The Breakthrough Prize in Fundamental Physics recognizes transformative achievements in the field of fundamental physics, with a special focus on recent developments. The 2014 winners are:

Michael B. Green, University of Cambridge, and John H. Schwarz, California Institute of Technology, for opening new perspectives on quantum gravity and the unification of forces.

The Breakthrough Prize in Life Sciences recognizes excellence in research aimed at curing intractable diseases and extending human life. The 2014 recipients are:

James Allison, MD Anderson Cancer Center for the discovery of T cell checkpoint blockade as effective cancer therapy.

Mahlon DeLong, Emory University for defining the interlocking circuits in the brain that malfunction in Parkinson's disease. This scientific foundation underlies the circuit-based treatment of Parkinson's disease by deep brain stimulation.

Michael Hall, University of Basel for the discovery of Target of Rapamycin (TOR) and its role in cell growth control.

Robert Langer, David H. Koch Institute Professor at the Massachusetts Institute of Technology for discoveries leading to the development of controlled drug-release systems and new biomaterials.

Richard Lifton, Yale University; Howard Hughes Medical Institute for the discovery of genes and biochemical mechanisms that cause hypertension.

Alexander Varshavsky, California Institute of Technology for discovering critical molecular determinants and biological functions of intracellular protein degradation.

"Scientists should be celebrated as heroes, and we are honored to be part of today's celebration of the newest winners of the Breakthrough Prize in Life Sciences and the Fundamental Physics Prize," said Anne Wojcicki and Sergey Brin.

The prize ceremony was hosted by actor Kevin Spacey, and awards were presented by the Prize sponsors and by celebrities including Conan O'Brien, Glenn Close, Rob Lowe and Michael C. Hall. The event was organized in cooperation with Vanity Fair and produced and directed by Don Mischer, the producer and director of the Academy Awards, among other television and live events. Grammy-nominated singer Lana Del Rey performed live for the guests of the ceremony.

The event will be televised by the Science Channel, one of the Discovery networks; it will be broadcast at 9pm on January 27th.

At the end of the ceremony, Mark Zuckerberg and Yuri Milner announced the launch of a new $3 million Breakthrough Prize in Mathematics. The details of the new prize will be announced at a later date.

"The Breakthrough Prize is our effort to put the spotlight on these amazing heroes. Their work in physics and genetics, cosmology, neurology and mathematics will change lives for generations and we are excited to celebrate them," commented Mark Zuckerberg.

Yuri Milner said: "Einstein said, 'Pure mathematics is the poetry of logical ideas'. It is in this spirit that Mark and myself are announcing a new Breakthrough Prize in Mathematics. The work that the Prize recognizes could be the foundation for genetic engineering, quantum computing or Artificial Intelligence; but above all, for human knowledge itself."

This commitment to the pursuit and dissemination of knowledge is not limited to the Prize ceremony. On December 13, there will be two Breakthrough Prize Symposiums: at Stanford, on the Future of Fundamental Science; and at the University of California, San Francisco, on the Future of the Biological Sciences. Winners of the Breakthrough Prize from 2012, 2013 and 2014 will give lectures and take part in panel discussions before an invited audience.

Art Levinson, the chairman of the Breakthrough Prize in Life Sciences Foundation, said: "We are honored to recognize such an outstanding group of scientists as this year's Breakthrough Prize Laureates. We are sure they will continue to push back the boundaries of knowledge in the years to come."

About the Breakthrough Prizes

The Breakthrough Prize in Fundamental Physics and Life Sciences are founded by Sergey Brin & Anne Wojcicki, Jack Ma &Cathy Zhang, Yuri & Julia Milner and Mark Zuckerberg & Priscilla Chan. The prizes aim to celebrate scientists and generate excitement about the pursuit of science as a career. Breakthrough Prizes are funded by a grant from Sergey Brin and Anne Wojcicki's foundation, The Brin Wojcicki Foundation; a grant from Mark Zuckerberg's fund at the Silicon Valley Community Foundation; a grant from Jack Ma Foundation; and a grant from Milner Foundation. Laureates of all prizes are chosen by Selection Committees, which are comprised of prior recipients of the prizes.

The Selection Committee for the 2014 Breakthrough Prize in Fundamental Physics included:

Nima Arkani-Hamed

Lyn Evans

Fabiola Gianotti

Alan Guth

Stephen Hawking

Joseph Incandela

Alexei Kitaev

Maxim Kontsevich

Andrei Linde

Juan Maldacena

Alexander Polyakov

Nathan Seiberg

Ashoke Sen

Edward Witten

The Selection Committee for the 2014 Breakthrough Prize in Life Sciences included:

Cornelia I. Bargmann

David Botstein

Lewis C. Cantley

Hans Clevers

Napoleone Ferrara

Titia de Lange

Eric S. Lander

Charles L. Sawyers

Bert Vogelstein

Robert A. Weinberg

Shinya Yamanaka

Additional information on the Breakthrough Prizes is available at:

Media Contacts

Brunswick Group:

Oliver Phillips

+1 415 671 7676

Prize Foundations:

Leonid Solovyev

+44 7590 976 33

23andMe to only provide ancestry, raw genetics data during FDA review

By MICHELLE CASTILLO, CBS NEWS, December 6, 2013, 3:33 PM

Genetic testing company 23andMe said Thursday it will still allow consumers to access ancestry-related information and raw genetic data without interpretation while complying with a Food and Drug Administration regulatory review.

23andMe added in a statement that it will continue with the company's educational projects and research using genetic and phenotypic data in their database.

“We remain firmly committed to fulfilling our long-term mission to help people everywhere have access to their own genetic data and have the ability to use that information to improve their lives,” Anne Wojcicki, co-founder and CEO of 23andMe, said in the press release. “Our goal is to work cooperatively with the FDA to provide that opportunity in a way that clearly demonstrates the benefit to people and the validity of the science that underlies the test.”

23andMe sold $99 genetic testing kits that allowed consumers to send in saliva samples. Laboratories then tested the sample, and the company provided detailed information based on the DNA analysis. This included ancestral origins, what traits the person was likely to have and what diseases they were more prone to developing.

On Nov. 22, the FDA asked 23andMe to stop selling the Saliva Collection Kit and Personal Genome Service. The FDA has restricted the company from providing health-related genetics tests while they are being reviewed.

The company said in marketing materials that the kits could provide reports on 254 diseases and conditions, but the FDA noted that the product was not approved by the government for those purposes.

The FDA was especially concerned because the results could cause consumers to make difficult, life-changing decisions on high-risk health issues, even though there was a possibility for a false positive or negative test result. The FDA was also troubled by the medication sensitivity tests because customers may be more likely to self-medicate or change their dosage without consulting with a medical professional.

Customers who were provided with health-related results, or those who purchased kits before Nov. 22, 2013, will still have access to or will be given health-related results. Those who purchased kits after this date will only get ancestry information and raw genetic data without any form of interpretation. It is possible that they will get health-related results at a later date, pending FDA approval. Customers who bought a kit on or after Nov. 22, 2013 can receive a refund, and will be notified by email about how to go about receiving their money back.

A class action lawsuit was filed in the U.S. district court of California on Dec. 4 claiming that the company falsely marketed their kits, and have not proven that they provide scientifically-accurate results.

In October, critics spoke out about a patent 23andMe filed on a product called the Family Traits Inheritance Calculator. The product would allow consumers to send in a saliva sample to see what genetic traits and diseases their potential offspring could have. The application said that the company was seeking to use the item in fertility clinics. It could mean parents would be able to pick their children based on genetic superiority, which critics called "designer babies."

The company later clarified that while at the time they filed the patent they were considering practical applications in the fertility clinic, they have since decided to go in a different direction.

An article in Scientific American has also raised concerns that people who submitted their samples to 23andMe sign away their rights to their personal genetic data, allowing the company to share information about consumers’ genomes to other companies. This could potentially lead to more targeted consumer medicine, especially from insurance and pharmaceutical companies that would know a person’s “weaknesses.”

23andMe said it would not sell information without explicit consent from the consumer.

[Anne Wojcicki, "who picked up the Genome Baby left on the doorstep of IT", see at 6:10 of Google Tech Talks YouTube 2008 , is not the type who'd ever abandon her baby. The video shown in the December 6th article (aired November 7th) showed among its goals to "Revolutionize Healthcare". This may have rubbed Conservatives the wrong way, even triggering some into chaos. With the Solomon wisdom of dropping health advisories (for a while) and going on with the rest, she clearly wins this round with the FDA.

Notes for the two key sentences of the write-up: "On Nov. 22, the FDA asked 23andMe to stop selling the Saliva Collection Kit and Personal Genome Service. The FDA has restricted the company from providing health-related genetics tests while they are being reviewed." Well, the first sentence is how it happened, by a bureaucrat already on record for a mis-step (he declared in his letter "the 510(k)s are considered withdrawn, see 21 C.F.R. 807.87(1)"). The second sentence may be attributed to Dr. Margaret Hamburg, M.D., the Commissioner of the FDA, a number of days later (Dec. 3), when she may have realized the enormity of the global business and geopolitical implications of what appears to be another mis-step, and thus affirmed in the Wall Street Journal a pledge by her subordinate that [FDA and 23andMe] "remain committed to continuing our ongoing dialogue with the company in order to bring a safe, effective and trusted product to the market".

Be that as it may, the pledged "panel" and/or "dialogue" seems extremely unlikely to conclude prior to the December 16 "deadline" set by the "Warning Letter". It is not only difficult but probably meaningless to guesstimate the bureaucratic damage to the Holiday Business of 23andMe, Inc., for any amount appears incomparably small next to the illegal drug traffic, measured at US $321.6 billion in 2003 (a decade ago; by all means probably much more damaging as 2014 nears).

Two competing trends have de facto been set into motion. Anne Wojcicki is not only committed to her "Genome Baby", but her conduct is exemplary; in retrospect it reaffirms her having been declared by Fast Company "The Most Daring CEO of 2013". On the other hand, chaotic regulation accelerates the "Globalization of the Genome Industry", in this case via "third party genome analytics", where the US can only minimize damage if it is balanced by "US movers & shakers who invest locally but operate globally".

Let's make it very clear: globalization did not start with the ongoing "episode of mis-communication". Roche established its subsidiary in Bangalore in 1956. Merck bought SiRNA in San Francisco for $1.1 Bn in 2006, but shut down the SF facility in July 2011 - and almost simultaneously, nearly exactly two years ago to this day (December 12, 2011), forked over $1.5 billion to its "research facility" in Beijing.

The reasons are not just bureaucratic. Applying a not-necessarily-effective regimen of drugs generates much more profit than a single "bulls-eye" of (genome-based) precision medicine. Thus, innovation may increasingly flee to countries where the reverse is true, since health care there is provided as a service (that needs to be both effective and of minimal cost) rather than as a business - particularly if the country not only recognizes but acts upon the strategic, vital importance of genomics.

Need to list? One does not think so.

We could narrow down the list, however, to specific countries with a particular heritage in fractals: Lithuania, Poland & France [Mandelbrot], France [Perez], Russia [Grosberg], The Netherlands [Ruis], Germany [Heinz-Otto Peitgen], Switzerland [Losa], Australia [Barnsley, Jelinek], Japan [Hokusai art], Korea [Dragon Temple]. China, with its most centralized government, is a category in itself [with its "fractal dragons"], while arch-rival India [which abounds with fractal architecture] might be moved in a coherent fashion by well-proven industrial groups [with genome-based generics to be made most competitively cost-effective for service-based health care]. The USA [Grosberg, Lieberman, Pellionisz, Shaw, etc.] can use its US/global leveraging skills.

What the future holds is anybody's "weather prediction" entering a nonlinear-dynamics mode. The state of the art of genomics is definitely fragmented (fractal), and genome regulation seems chaotic. - AJP]

Fractal Genome and its Chaotic Regulation?

The news of the FDA's "Warning Letter" to 23andMe, forcing their marketing & sales to "cease and desist", plus a more recent class-action lawsuit against them (separate, but connected in a convoluted fashion), altogether created a veritable explosion in cyberspace.

While the visible and the underlying (deeply entangled) issues are singularly complex, up to and including global strategy and balance, this columnist focuses on the perhaps not so widely understood science (an outstanding legal analysis is here). After all, it is interesting when science, and the people's thinking about it, is challenged by laws without a totally convincing mandate.

Dr. Guiterrez, who signed the "Warning Letter" from the FDA, received a bachelor's degree from Haverford College and master's and doctoral degrees in chemistry from Princeton University. Coming from someone with eminent science degrees in chemistry (other FDA officials have earned a Ph.D. in physical chemistry at Yale), Dr. Guiterrez' word "chaotic" (below) must be taken very seriously when, in his YouTube panel at PMWC earlier this year, he asks "what is the state of Personalized Medicine regulation?" and answers: "the best they can say is 'chaotic'" (0:39).

"Chaotic" regulation of fractal genome?

The fact that the genome is fractal is a paradigm shift with escalating acceptance, including the President's Science Adviser putting it on the cover of Science, and lately heralded by some as prize-worthy.
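The Science-cover result referred to above modeled genome folding as a dense, locality-preserving "fractal globule", commonly visualized with the space-filling Hilbert curve. As a hedged illustration (added here, not from the source), a minimal Python sketch of the standard bit-manipulation algorithm that maps a 1-D "sequence position" to 2-D Hilbert-curve coordinates; the key property - consecutive positions along the curve remain spatial neighbors - is exactly what makes the fractal picture attractive for chromatin folding:

```python
def hilbert_d2xy(order, d):
    """Map index d (0 <= d < 4**order) along a Hilbert curve to (x, y)
    on a 2**order x 2**order grid (standard bit-manipulation algorithm)."""
    x = y = 0
    t = d
    s = 1
    while s < 2 ** order:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate the quadrant so sub-curves join up
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx                      # shift into the proper quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# A 16x16 curve: 256 consecutive "sequence positions" fill the square,
# and neighbors along the sequence remain neighbors in space.
path = [hilbert_d2xy(4, d) for d in range(256)]
```

Every cell of the grid is visited exactly once, and each step moves to an adjacent cell - a dense, knot-free folding of a 1-D strand into a compact space.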

How can static laws (which must provide predictability, at least in a "piece-wise linear" manner, as they are changed only in leaps and bounds from time to time) regulate the fractal genome in a chaotic fashion within what is supposed to be a legal "steady state"?

Without getting unduly mathematical here: chaos and fractals are collectively treated within the branch of mathematics called "nonlinear dynamics". Non-linearity, without question, results in a lack of trivially transparent predictability. For instance, scientists with a degree in chemistry are familiar with the Belousov-Zhabotinsky reaction, or BZ reaction, which serves as a classical example of non-equilibrium thermodynamics, resulting in the establishment of a nonlinear chemical oscillator [1959-1964].
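The claim that non-linearity defeats trivial predictability can be made concrete with a system far simpler than the BZ reaction: the logistic map, the textbook example of chaos. A minimal Python sketch (an illustration added here, not from the source) showing that two starting values differing by one part in ten billion diverge to an order-one separation within a few dozen iterations:

```python
def logistic_orbit(x0, n, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), returning the whole orbit."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2, 60)            # reference orbit
b = logistic_orbit(0.2 + 1e-10, 60)    # nearly identical start
divergence = max(abs(x - y) for x, y in zip(a, b))
# the tiny initial difference is amplified until prediction fails entirely
```

This sensitive dependence on initial conditions is precisely why long-range forecasts of a nonlinear system (chemical, meteorological or genomic) cannot be "piece-wise linear".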

People from all walks of life are intimately familiar with another example, much easier to understand than the BZ reaction: the grossly nonlinear dynamics of the weather! We know and understand that the weather is not totally, certainly not trivially, predictable. Moreover, it is rather unyielding to regulatory efforts by government(s) - and since those often act in rivalry, they can further aggravate complexity. Weather reports fall within the science of meteorology. Let us remember Mandelbrot's fractals again: "clouds are not spheres", and even lightning does not travel along a straight line. Speaking of him, almost everybody with even a remote familiarity with fractals would "visually recognize" a "Mandelbrot Set". However, a bit like the epoch-making Heisenberg uncertainty principle, one can either go for the exact mathematical description of the ideal infinite set defined by the rather simple iteration Z=Z^2+C, or, alternatively, deal with a particular instance of it, where the number of iterations in the real world is always finite, but shows individual diversity in both the number of iterations and the specific fractal defects of the instance. The number of particular iterations could be too few (like the weather "partly cloudy"), or fuller ("mostly cloudy"). Though the fractal equation holds for any set (that does not contain "fractal defects"), even the best experts might find it impossible to call exactly how many iterations (let alone answer the much more difficult question of "what fractal defects") any particular instance of the same set is made of.
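The point about finite iteration counts can be made concrete in a few lines of code. Below is a minimal Python sketch (an illustration added here, not from the source) of the standard escape-time test for Z=Z^2+C. With too small an iteration budget, a point just outside the set is misclassified as belonging to it - the "partly cloudy" versus "mostly cloudy" distinction above:

```python
def escape_count(c, max_iter):
    """Iterate z -> z*z + c from z = 0; return the step at which |z| exceeds 2,
    or max_iter if it never escapes within the budget."""
    z = 0.0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# c = 0 is inside the Mandelbrot set; c = 0.26 lies just outside
# (the set meets the real axis at c = 0.25) but escapes only slowly,
# so a 20-iteration budget wrongly classifies it as "inside".
```

A renderer using 20 iterations and one using 500 thus draw visibly different "instances" of the very same mathematical object.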

The 21st Century science of genomics is nascent. Its industrial applications (in health care, agriculture, energy, bio-defense, etc.) are both pioneering and, sometimes quite understandably, not public; thus a proper understanding of modern genomics is largely absent in society at large. Even the new axioms of a novel philosophy ("the genome is not your destiny", "ask what you can do for your genome") are at the dawn of a new era - not totally unlike the paradigm shift of quantum mechanics and nuclear physics that once already changed history and the world.

Since no single person may be entitled to arbitrate perhaps the greatest paradigm shift of science/technology ever, this realization may be the reason that Dr. Guiterrez, at 37:47 of another of his public appearances, called for a "panel" in cases of dispute, if any:

[Transcription is somewhat chaotic, but the concluding pledge seems very clear] "I guess the only thing I want to say is that even the guidance that one would put on companion diagnostics has flexibility, I don't think that you know there are forwarded by drug companies that would clearly increase the risks for them I don't deny that and the other thing is that I am not sure that I have seen too many uses where there is data to back them up that would I don't know I am not aware of any that the FDA would not allow and where is a dispute we are happy to take it to a panel"]

It may be a tall order at best to convene an effective panel of experts (particularly in the nonlinear dynamics of genome function) in the remaining days - countable on a badly injured single hand - of the 15 workdays of "cease and desist". A hastily assembled Congressional Hearing, similar to a past "non-scientific regulation of science", appears even more unlikely in a holiday panic-mode, and might not necessarily be constructive or even a cost-effective use of tax dollars.

Thus, essentially two likely alternatives remain - resulting in a most likely third outcome. One is a hyper-escalating and self-contradicting chaotic oscillation of "regulation" attempts. Second, "a mountain of labor may give birth to a mouse". Aspirin (and other NSAIDs) not only need no prescription, but their mortality, according to the FDA's own study, exceeds by orders of magnitude anything ever seen with "genetic tests" (where even the name, instead of "genomic", shows insufficient understanding). Whenever the US voters realize this (it takes time), all the taxpayer dollars spent on "chaotic regulation of the fractal genome by scientifically self-contradicting legalisms" might be replaced by a very inexpensive label* (already on aspirin, NSAIDs, saccharin, let alone myriads of additives and supplements): "*This statement has not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure or prevent any disease".

Short of the obvious, a likely scenario is that the industry of 23andMe, a jewel of Silicon Valley and "Innovation of the Year (2008)", could go down the river the way another Silicon Valley jewel, Complete Genomics, became what is presently a wholly owned subsidiary of BGI (with Chinese government subsidy in the billion-dollar range).

Presently, China owns 80% of the World's human DNA sequences (with the USA and the rest of the World sharing the remaining 20%), and the USA is at the brink of falling second to China in R&D. Thus, it is not inconceivable that international airports (already famed Hong Kong and Moscow, followed by Mexico City, Hyderabad, Bangalore, Dubai, Singapore, Delhi, Seoul, Tokyo, Toronto, Vancouver, Amsterdam, Zurich, Frankfurt, Berlin, Vienna, etc.) could easily become enriched by saliva and other DNA-sampling booths. US-based movers & shakers who invest locally and operate globally still have a chance to turn this around - or outright foreign parties can simply buy US patents (and thus monopoly rights to the lucrative US markets) and laugh all the way to their bank. (China has decided to shop for a couple of trillion dollars' worth of US property, intellectual or not.)

This commenter hates to make it sound worse (but listening to a 2008 Google Tech Talk YouTube could have saved billions of dollars of lost valuation). The US company, should it be acquired, with a customer base already on record as having actually paid for genome analytics, is in itself a treasure. Moreover, its already substantial primary data set is sufficient for use - or abuse - far beyond the intended noble purposes of 23andMe. An M&A may extend the analytics to be based on full DNA sequencing (where the announcement of the freshly FDA-approved Illumina full sequencer was surrounded by deafening silence, with one very notable exception).

Since Homo sapiens actually has 2x23 chromosomes plus mitochondrial DNA, an M&A entity might be newly named 47andWorld.

China and Russia already openly declared the strategic importance of genomics. Unless the publicly pledged "panel" is pre-empted by swift administrative action to defuse a danger to the (literally) vital interests of the USA, strategic world-balance will never be the same again. [Comment by Pellionisz, HolGenTech]

Gates Foundation to Double Donation to Fight AIDS

Washington Wire

December 2nd, 2013

[The richest person in the World (by software) looks pleased in the government office of the most able clinical genomics mover & shaker, while US R&D is about to fall second to China - AJP]

BETHESDA, Md. — Billionaire philanthropist Bill Gates said he plans to nearly double his foundation’s contribution to the Geneva-based Global Fund to Fight AIDS, Tuberculosis and Malaria, to as much as $500 million. Coupled with matching grants from other donors, Mr. Gates and his foundation’s officials said this could mean a total $1.6 billion contribution to the Global Fund.

The new money follows management troubles and a leveling off in funding for the Global Fund amid difficulties in the economies of many nations that contribute to it. Mr. Gates, who also spoke of worldwide health milestones that have been achieved by various groups including the Global Fund and his own foundation, made his remarks in a round-table discussion with news reporters preceding a lecture he gave at the National Institutes of Health.

The co-founder of Microsoft Corp., now a trustee and co-chair of the Bill & Melinda Gates Foundation, Mr. Gates said the commitment of various groups to fight diseases has had a measurable positive impact on lowered infant mortality and disease deaths around the world. Yet he said one impediment in that fight is the U.S. mandated budget cuts, known as the sequester, or sequestration.

The sequestration will force another $600 million reduction in federal funding for the NIH in January. This will cut the total budget from $29.1 billion in 2013, which itself was slashed by the sequestration from nearly $32 billion a year earlier. The sequestration was a mechanism created by congressional and White House negotiators to force strict cuts across the federal government when the two sides were unable to reach a more nuanced, negotiated budget level.

For the NIH, it means more limitations on what it can donate to medical research, both in the U.S. and abroad. The NIH currently funds about 40% of worldwide medical research and development outside the U.S., while the Gates Foundation pays for another 17%, according to Mr. Gates and NIH Director Francis Collins.

“Sequestration is a serious problem for the NIH in research,” said Mr. Gates. “It’s really a crisis where universities will have to look at their own infrastructure” and begin cutting back on their own research. Dr. Collins, in the roundtable discussion, agreed that sequestration “was the stupidest form of fiscal management.”

Mr. Gates said he is “deeply disappointed” in mandatory cuts in the NIH budget, since so many efforts against worldwide disease and malnutrition “are all long-term ventures,” the fruits of which won’t be seen for a while. Even so, he said, the Gates Foundation and other health groups have brought down the rate of infant death in recent decades from about 20 million annually to about 6.5 million now.

He said his goal is to lower that figure to below three million within 15 years, through vaccinating children worldwide and through efforts to reduce malaria, AIDS and rotavirus-caused diarrheal disease, among many others.

FDA warns Google-backed 23andMe to halt sales of genetic tests


Mon Nov 25, 2013 5:11pm EST

(Reuters) - The U.S. Food and Drug Administration has warned 23andMe, a company backed by Google Inc, to halt sales of its genetic tests because they have not received regulatory clearance.

23andMe, which was founded in 2006 by Anne Wojcicki, sells a $99 DNA test that the company says can detect a range of genetic variants and provide information about a person's health risks. Wojcicki recently separated from her husband, Sergey Brin, a co-founder of Google.

In a warning letter dated November 22 and released on Monday, the FDA said products that are designed to diagnose, mitigate or prevent disease are medical devices that require regulatory clearance or approval, "as FDA has explained to you on numerous occasions."

The privately held company, which is based in Mountain View, California, acknowledged receipt of the letter and said in a statement that "we recognize that we have not met the FDA's expectations regarding timeline and communication regarding our submission."

The FDA said some of the intended uses of the company's Saliva Collection Kit and Personal Genome Service (PGS) are particularly concerning, including risk assessments for certain cancers.

The agency said false positive tests for certain breast or ovarian cancers could lead a patient to undergo preventative surgery including mastectomy, intensive screening or other potentially risky procedures. A false negative could result in a failure to recognize and act on an actual risk.

23andMe will not be able to sell its tests for medical purposes until it submits the necessary data.

The FDA has not cleared any genetic tests that are offered directly to consumers.

Kathy Hudson, deputy director for science, outreach, and policy at the National Institutes of Health, said the FDA action clarifies its expectations for direct-to-consumer genetic testing. "NIH believes genetic information has a great potential to improve human health, but there need to be reliable, validated tests," she said.

One concern is that the results of genetics research, especially that linking a DNA variant to the risk of a particular disease, might apply to some ethnic groups but not to others. As a result, a consumer might think she has an elevated risk of some illness when in fact she does not.

On its website "the company quotes numbers for risk from published scientific papers, but you'd have to be pretty sophisticated to know that if the study was done on western Europeans it might not be relevant to you if you're Chinese," said geneticist Dr Jeff Murray of the University of Iowa and president of the American Society of Human Genetics (ASHG).

The FDA said in its letter that 23andMe had submitted applications in July and September of 2012 for several uses of its saliva test but had failed to address issues raised by the agency or to provide additional information requested. As a result, the FDA said, the applications "are considered withdrawn."

The company said its relationship with the FDA is "extremely important to us and we are committed to fully engaging with them to address their concerns."


Dr. David Agus, a professor of medicine and engineering at the University of Southern California and founder of Navigenics, one of the first personal-DNA testing companies, said the FDA's letter to 23andMe "is not a death knell to personal DNA testing" but should be a wake-up call. "We have to be transparent with consumers about what sequencing their genome can and cannot reveal," he added.

Navigenics was acquired last year by Life Technologies Corp.

The FDA said it had been "diligently working" to help 23andMe comply with the law, and spent significant time evaluating the intended uses of the DNA-testing product. It said it provided detailed feedback to the company through more than 14 face-to-face and teleconference meetings, hundreds of email exchanges, and dozens of written communications.

"However, even after these many interactions with 23andMe, we still do not have any assurance that the firm has analytically or clinically validated the PGS for its intended uses," the FDA said.

While 23andMe may not have been communicating with the FDA, Wojcicki has been talking at length to the media. Earlier this month she told the New York Times that her company had mapped the genotypes of 475,000 people over the last five years and expected to "hit a million" in the first quarter of 2014.

In a recent article in Fast Company, Wojcicki said her ultimate goal was to sign up 25 million people. "Once you get 25 million people, there's just a huge power of what types of discoveries you can make," she said.

The company name refers to the 23 pairs of chromosomes that make up each individual's genome.

After years of trying to obtain from 23andMe the information it needs to ensure the tests are accurate, the FDA appears to have finally lost patience.

"I think this will certainly grab the attention of a lot of other companies out there," said Joseph McInerney, executive vice president of the ASHG.


[Comment by AJP - The FDA (Food and Drug Administration) is admittedly a "US Government branch". Nobody ever claimed that (any) "government" needs to excel in science. In general, they do not - simply because their forte lies elsewhere. Whenever an important-enough issue arises in history, they hire top scientists (see e.g. Oppenheimer, Teller and von Neumann, hired by the US government, or the UK government hiring mathematician Alan Turing to successfully break the war-code of the enemy). Occasionally, when free enterprise calls for science expertise, it does its own hiring (e.g. Bill Joy was hired by KPCB). This columnist has done both types of advisorship. Hired as a National Research Council Senior Associate, he was assigned to NASA Ames Research Center (a US Government branch) to cope with the Artificial Intelligence-to-Neural Net science paradigm shift. On another occasion, hired by New York University Medical Center, he successfully fought the USPTO to accomplish the issuance of a patent on the utility of the biological neural net of the cerebellum (now expired). Recently, in the role of advising a patent lawyer's office, he guided the 2002 FractoGene patent application to issued patent 8,280,641 (in force till late March, 2026). In all the above cases, science won over objections of bureaucrats that may have been technically legitimate but were not necessarily scientifically valid (recalling an incident some years ago, when a government branch launched an attack on scientists but, unguided, made the mistake of branding its own attack "not scientific"). In the opinion of this columnist, the art is how to make science what it is: the decisive issue. Take, for instance, the "weather advisory". Any native can hold a wet finger or two in the wind and "predict the weather". Scientists would, however, brush such an "advisory" aside - by pointing out its apparent lack of science-depth.
"Meteorology" can be dismissed as "wet finger" work, mere "prediction", an "advisory", etc., but none of that really matters, since ultimately meteorology *is* a science. Has anybody ever legally challenged a "weather report"? Not only would it look silly, but modern "weather advisories" rely on massive computing power and mighty algorithms, based on statistics and probability theory. In any court of law, the ultimate decisive factor is the power of expertise of the science advisers that the respective parties employ. Those (on either side) who demonstrably cannot properly round a real (non-integer) number would probably not measure up very well, e.g. in the statistics and probability theory of nonlinear dynamics that "genomics" (not some obsolete "genetics") truly calls for. - Pellionisz]

Mandelbrot, the Genius of Fractals, Born 89 Years Ago, Passed Away 3 Years Ago

[Comment by AJP] - In the past six weeks the World marked both the 89th anniversary of the birth and the third anniversary of the death (at age 85) of "The Father of Fractals", Benoit Mandelbrot. His last video interview, below, was made just 19 days before his passing (on October 14, 2010, now over three years ago, of what had started as pancreatic cancer). Perhaps the most touching tribute to his genius came from his own words (at 1:06). "As if a curtain opened", he said, when he saw what so many others had looked at before him - but the others did not understand what was "obvious" to him. A true genius.

[see IBM Memorial Tribute article with YouTube here]

Having known Benoit, one finds perhaps the most moving moment of his last interview lurking behind what he did not say. As is clear from his most celebrated book, "The Fractal Geometry of Nature", Mandelbrot was keenly aware that his advances would leap over Euclidean and even Riemannian geometries. His quest was to unify metrical and non-metrical spaces (in the sense of fractal Weyl laws). Such mathematical underpinning is universal in "Nature". He clearly understood that, with time, the fractal/chaotic approach would transform the mathematical infrastructure of all natural sciences. Thus, when he did say about his "Eureka moment" at 1:06 "as if a curtain was lifted", one cannot help being hard-hit by the next moment of the video. His joy visibly turned into a choked, deep sadness. Was it the feeling, harbored for years, that the time of his own "final curtain falling" might be so near? Was it the sorrow that he would have to leave entire fields that he had already sowed, to be reaped by the next generations?

One would think so.

Geometrization of biology, ever since Descartes [Fig.1., link to Springer], has always been the unfinished dream. See, more recently, Schroedinger (What is Life?, 1943) and von Neumann (The Computer and the Brain, 1955). Janos von Neumann is particularly relevant for me when remembering Mandelbrot, since von Neumann's book, by pointing out the lack of an intrinsic mathematics of brain function, was my intellectual imprinting. I never could meet von Neumann - but as Janos had invited Mandelbrot to Princeton as his last PostDoc, Benoit and I amply exchanged our mutual recollections of how von Neumann cardinally influenced our respective scientific careers.

Some may wonder if Mandelbrot was involved in the geometrization of biology in a "hands-on" manner. In fact (as mentioned in his Memoirs), Mandelbrot was offered an early chance (and money) to lead the mathematization of biology. However, he declined, as he deemed that "biology was not ready". I deeply believe that had von Neumann lived another couple of decades (easily possible, as he passed away of cancer at only 53), my first paradigm shift from AI to biological neural nets would have happened up to half a century earlier. Likewise, had Mandelbrot been in a position to penetrate biology, "Genomics turned Informatics" (Leroy Hood, 2002) would now be decades ahead of the present time. Indeed, Mandelbrot reached his first tenured position, as "Sterling Professor of Mathematics at Yale", at age 75, by the year 2000 (when Ohno, author of the misnomer "Junk DNA", was still alive) - only about the time the Human Genome Project readied Genomics to turn into Informatics.

Was Mandelbrot aware at all of my FractoGene approach (filed in 2002)? Certainly! See the photos below, taken at Stanford in the summer of 2004, where I handed over to Benoit a copy of my 1989 paper (on the fractal Purkinje neuron, with not one but two citations of Mandelbrot, and personally dedicated to him). I also gave him a CD with some essentials (after the USPTO submission in 2002), with my "Eureka Figure" (the original figure of my 2002, 2003, 2007 USPTO applications) on the cover.

His question was not about the science (particularly since Francis Crick, author of the misnomer "Central Dogma", was still alive). We both knew that he had in fact seeded it, as I properly cited Mandelbrot in my 1989 paper dedicated to him. On page 162 of his book "The Fractal Geometry of Nature" Mandelbrot mused: "NEURON BRANCHING. The Purkinje cells in mammalian cerebellum are practically flat, and their dendrites form a plane-filling maze... the notion that neurons are fractals remains conjectural". So went the two short paragraphs (with my side-note that the cerebellum was not a "mammalian" invention of Nature, as it appeared about 400 million years ago in vertebrate fish such as sharks). Mandelbrot's musing resulted in my 1989 paper "Neural Geometry: Towards a Fractal Model of Neurons".

During the 2004 Stanford conference we could see Mandelbrot study the 1989 paper and my FractoGene Hallmark:

Mandelbrot quickly sized up the utility of genomic and organismal fractals put into the light of a "cause and effect" view. Thus, his dominant (telling) question was: "who supports you?". From his Memoirs we learn that von Neumann supported him directly at the Princeton Institute for Advanced Study for a year, but von Neumann also assigned top "movers & shakers" to look after Mandelbrot in later times. Since revolutions have an appetite for eating their children, Mandelbrot's "perilous" revolution was rescued by Princeton, IBM and finally by Yale.

Mandelbrot, as in most of his lectures, told the audience of his uncanny ability "just to look at most any rough picture, and tell its fractal dimension-number with a high degree of precision". Thus, I publicly asked only one question at his plenary lecture: "Prof. Mandelbrot, given your uncanny ability, what do you think the fractal dimension-number of the genome is?"

His answer was extremely telling, in both what he did not say, and what he did say.

He could have answered with a question: "what are you talking about - who says the genome is fractal?" He did NOT say anything like that, since snippets of genomes had long since been turned even into "fractal music" (Ohno, originator of the misnomer "Junk DNA" [1972], and his wife pursued "fractal DNA music" as an ardent hobby). Fractal dimension-numbers had long been known to be distinctly different for the music of Mozart, Beethoven, etc., with Bartok's "absolute music" compositions being the closest to "fractal DNA sounds".

Mandelbrot's answer, however, was brutally honest: "I don't know".

He could have known and cited some numbers, and he was clearly and extremely interested in fractals in biology.
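For readers wondering how such a "fractal dimension-number" is obtained in practice, the standard method Mandelbrot popularized is box-counting: cover the object with boxes of shrinking size eps and measure the slope of log N(eps) against log(1/eps). A minimal Python sketch (an illustration added here, not from the source) using the classical middle-thirds Cantor set, whose dimension is known exactly to be ln 2 / ln 3 ≈ 0.63:

```python
import math

def cantor_points(depth):
    """Left endpoints of the level-`depth` Cantor-set intervals, encoded as
    integers p such that the point is p / 3**depth (exact, no floats)."""
    pts = [0]
    for _ in range(depth):
        pts = [3 * p for p in pts] + [3 * p + 2 for p in pts]
    return pts

def box_count(pts, depth, k):
    """Number of boxes of side 3**-k containing at least one point."""
    return len({p // 3 ** (depth - k) for p in pts})

# Dimension estimate: slope of log N(eps) versus log (1/eps)
pts = cantor_points(10)
k1, k2 = 2, 8
slope = (math.log(box_count(pts, 10, k2)) - math.log(box_count(pts, 10, k1))) / \
        (math.log(3 ** k2) - math.log(3 ** k1))
# slope recovers ln 2 / ln 3, the exact fractal dimension of the Cantor set
```

For a measured object (a coastline, a neuron, or a genome-derived curve) the same slope is fit by regression over several box sizes, which is why an expert's "eyeballed" dimension-number is always an estimate.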

Mandelbrot simply did not have the time & expertise, nor the support, to pursue fractals in biology (especially genomics) as he knew the topic called for and deserved. In addition (as the eulogy in the New York Times observed, "For most of his career, Dr. Mandelbrot had a reputation as an outsider to the mathematical establishment"), his life was already so full of "going against the grain as a maverick" that the last thing he needed was personal attacks by caricatures - like a lonely moron who (believe it or not) as of 2013 still clings to both "flat Earth" misunderstandings, "Junk DNA" and the "Central Dogma", contrary to overwhelming scientific evidence for going beyond the disarmingly naive nascence of genome informatics.

Dear Benoit, Farewell and R.I.P. "The curtain did not fall". It has just been opened for the "Prelude". We do not even know how many Acts and Scenes your Opera of Genomics will be composed of. [end of comment by AJP]

A Video Tribute to Benoît Mandelbrot

18 October 2010

On Thursday, mathematician Benoît Mandelbrot died of pancreatic cancer at age 85. Mandelbrot is most famous for coining the term "fractal," which he applied to fields as diverse as physics and finance. In 2007, an undergraduate at Cornell University named Pisut Wisessing created this video about the Mandelbrot set, a set of complex numbers that, when plotted, forms an intricate fractal. The video is based on a song by Jonathan Coulton, who has penned other science-based music. (Warning: There is some explicit language.)

Watch Errol Morris’ New Documentary About the Father of Fractals

Equation by brief Cartoon Rap

By Forrest Wickman

Errol Morris begins his new short documentary on Benoît Mandelbrot by asking a wonderfully straightforward question: “The fractal stuff, what was the origins of that?”

Mandelbrot, true to form, approaches this potentially complex question with an answer that is characteristically simple: When he looked at things that seemed extremely complicated to others, they seemed almost transparent to him.

Morris’ new documentary on Mandelbrot takes a similar approach, attempting to distill his work and story into something understandable even for those who don’t know his widely influential fractal geometry.

The short is presented as part of IBM’s “Big Brains, Small Films” series (Mandelbrot worked with IBM for 35 years), but it doesn’t feel like a commercial. It feels like a true Errol Morris film—complete with snazzy reenactments, a minimalist score, and cinematography that uses Morris’ signature Interrotron.

Mandelbrot died just 19 days after sitting down with Morris; IBM says this was his final interview. I’m glad that he got to explain his complex work, in simple terms, one last time.

Forrest Wickman is a Slate staff writer.


[Readers may wish to refresh their understanding of the utterly simple Z=Z^2+C explanation of the mind-boggling "complexity" of The Mandelbrot Set. A lighter, entertainment version of two and a half minutes is above on YouTube. An almost full-hour NOVA Special of "fractals in their full glory" is below - AJP]

Equation by a 1-Hour NOVA Special of Fractals in Full Glory

Would he do it again?

Yes, and those who really know Dr. Pellionisz understand why he would.

Today (November 21, 2013) is exactly the 11th anniversary of the day when journalist Hal Plotkin wrote a very lucid essay for the electronic version (SF Gate) of the San Francisco Chronicle (see link for full text, with excerpts below)


"In a provisional patent application filed July 31 [2002], Pellionisz claims to have unlocked a key to the hidden role junk DNA plays in growth -- and in life itself.

Rather than being useless evolutionary debris, he says, the mysteriously repetitive but not identical strands of genetic material are in reality building instructions organized in a special type of pattern known as a fractal. It's this pattern of fractal instructions, he says, that tells genes what they must do in order to form living tissue, everything from the wings of a fly to the entire body of a full-grown human. [A large group led by Karolinska Institutet of Sweden revealed just days ago, that "pseudogenes" constituting 40-50% of "Junk" of human DNA are NOT "evolutionary debris" but actually many were found by sophisticated algorithms/computation to also "code for proteins", see entry earlier this week, November 17, 2013]

Another way to describe the idea: The genes we know about today, Pellionisz says, can be thought of as something similar to machines that make bricks (proteins, in the case of genes), with certain junk-DNA sections providing a blueprint for the different ways those proteins are assembled...

Hunt adds that most biologists simply don't know enough about fractals or the advanced math behind them to understand how they might apply to the field of genetic medicine.

"We need someone to tap us on the shoulder and explain it to us," he says. "But if it clicks as a tool, we would be more than happy to use it."

"Overall, we know very little about what is referred to as 'junk DNA,'" he adds. "But every year that goes by, there are more insights into the possible role they might play."

...Pellionisz has been working on understanding the possible linkages between math and physiology since his earliest days as a college student in Hungary, when he first decided to devote his life to understanding how the brain works. It's that pursuit that has helped lead him to his latest ideas, he says...

Experts generally agree that a breakthrough in figuring out the role junk DNA plays, if any, would represent a spectacular advance in our understanding about how DNA in general turns inanimate matter into living organisms. If that happens, humanity would take a giant leap toward gaining control of the machinery of life itself, which would open up a wild new frontier in medicine and science that could lead to everything from growing new organs designed for specific patients to preventing and curing any health- or age-related problems that have a genetic origin or component.

Pellionisz says his main goal is to set the stage for the next and even more promising generation of research into genetics. Given the fact that he may be the first person to assert a patent on intron fractal counting and analysis, it's also conceivable that Pellionisz could wind up with related commercial rights worth billions of dollars. If he's wrong, of course, any patent he might receive will be worthless. And even if he's right, he could have to contend with other inventors who may also have recently filed similar patent claims that, like his, have not yet been fully disclosed.

It could be years, even decades, before the dust settles and Pellionisz learns whether his patent application has any real merit, as well as whether someone else beat him to the punch with an earlier enforceable patent claim.


One has to admit at the outset that the decade it took to win the issued patent was not exactly easy.

This is in accordance with Prof. Hunt's original comment that "most biologists simply don't know enough about fractals or the advanced math behind them to understand how they might apply to the field of genetic medicine". Advised by patent lawyers that the FractoGene submission would be examined by biologists, the sole inventor had a choice: either put in extremely narrow, "easy to understand" claims (which would have severely limited the patent's potentially enormous scope), or follow a much more sophisticated patent strategy toward the broadest claims, for which a finessed teaching was necessary. Here is Claim 1 (of 21), with some non-obvious comments on the finesse.

CLAIM 1 (of 21; the broadest claim, abbreviated below by omitting the "one or more" strings; emphasis added):

"A method to analyze and interpret information inherent in hereditary material of organisms in terms of fractal sets, in relation with resulting fractal structures and fractal functions of said organisms, such that said fractal sets are defined as a superposition over at least two iterations of a fractal template"

How could Dr. Pellionisz successfully obtain the legal monopoly conferred by such an enormously broad claim (even at the cost of a decade of waiting and absorbing a couple of million dollars "out of pocket" in expenses and lost revenue)?

First, look at the (often overlooked) expression in the Claim: "in relation with". Fractal properties of either organisms (trees, lungs, etc.) or of the DNA itself (even in the form of "fractal DNA music") had been examined in the prior art. Many scientists had looked at the apparent fractality of various organisms and various genomes - but they missed the cause-and-effect relationship between the two, and with it a colossal utility. "In relation with" was brought into the teaching by invoking not only the mathematical arsenal of fractals, but also that of statistics and probability theory - all in a way that is digestible for a patent examiner who is likely to be a biologist!

Fortunately, the legal mechanism of Paragraph 2163.07(b), "Incorporation by Reference", could be invoked. According to this provision: "Instead of repeating some information contained in another document, an application may attempt to incorporate the content of another document or part thereof by reference to the document in the text of the specification. The information incorporated is as much a part of the application as filed as if the text was repeated in the application, and should be treated as part of the text of the application as filed."

Utilization of this legal venue led to a highly mathematical teaching (with attachments "incorporated by reference" amounting to some 750 pages of advanced mathematics). It is possible that for a full decade the examiner never had a chance to read it in full; in the last legal round the USPTO required submission of all "methods" (some of the papers/books were not easily found and certainly could not just be "clicked").

In the same week when ENCODE 2012 avalanched biologists, the Notification of Issue of the FractoGene patent arrived. The "fractal approach" is now in the "prize-worthy" box instead of the bin of formerly "lucid heresy" items.

Intellectual Property experts know that the Inventor of the "Best Methods" described in the Application (with the last CIP of 2007) has by now had six additional years of "Improving Best Methods". These improvements are not disclosed in the now-public patent but, as is customary, are held as "Trade Secrets". Thus, while the patent is in force until late March of 2026, "Trade Secrets never expire".

Given the scope of the Industrialization of Genomics, enforcement of the patent will be greatly assisted by those global IT/Health companies that are already committed to this "next big thing" (SAMSUNG has been eyeing the market of US hospitals for over two years, and SIEMENS, PHILIPS, SONY etc. are not very far behind). The list of already committed or "getting ready" USA-based global IT/Chip/Health companies need not (and now legally cannot) be belabored. The market is segmented so that exclusive licenses can be auctioned out selectively among "fierce competitors" and arch-rivals; segments include compute platforms, geographical regions, targeted diseases (for diagnostic and therapeutic purposes), hospital systems, and more.

With Thanksgiving coming up, it is time for R&R&R, with "Reflecting" perhaps the most unusual of the three. The inventor and sole patent holder (with the next generation already spruced up) has already enjoyed a few nice days of contemplation.

Yes, even against all the odds experienced (an unusual decade, admittedly peppered with some rather astonishing features), he would do it again. "It is in the genome", one might say. Though, as we learned the other day, the facial and other features that are individual to us all are NOT in the genes. Creativity, too, seems to lurk in the (formerly) intergenic regions...

Junk Genes Of Protein Codes Might Be Helpful To Understand Cancer

Nov 18th, 2013

Researchers have revealed that approximately one hundred human genome regions known as pseudogenes, which turn out to code for proteins, might have a relationship with cancer.

Generally, only about 1.5% of the human genome consists of protein-coding genes. Among the remaining DNA, some sequences regulate the ability of the genes to produce proteins, but the bulk of it was thought to serve no purpose and is often termed "junk DNA".

This junk DNA contains pseudogenes, which were regarded as genes with no functional purpose - believed to be the remains of genes that lost their ability to function during the process of evolution.

Researchers have now shown that the novel method of proteogenomics makes it possible to hunt for protein-coding genes in the remaining 98.5% - a task that had been impossible until now. The research also indicates that some pseudogenes produce proteins, which means they have a genuine function.

Janne Lehtiö, assistant professor and lead author of the study, added that there is evidence for over 100 regions with new protein codes in the human genome. Similar results were also derived from cells of mice. The researchers' next objective is to find out whether these "junkyard genes" in the human genome play a vital role in cancer or other medical conditions.

Dr. Lehtiö added that their study challenges the old theory that pseudogenes do not code for proteins, and asserted that their latest method paves the way to protein-based genome annotation in organisms with complex genomes. This might lead to the discovery of novel protein-coding genes not only in human beings but in all species with a sequenced genome.

[In the "Old School", there was no such thing as a "Junk Gene". Parts of the DNA were thought to be either "Genes" - the rest was labeled "Junk" by Ohno (1972, see facsimile of the original here; pseudogenes were specifically junked "for the importance of doing nothing"). The Old School's Junk (DNA) is the New School's treasure. The Karolinska-led paper literally blows both the "Junk" and "Gene" categories to pieces (elsewhere a new word, "mosaic", is on its way - towards FractoGene). Though the Nature Methods paper mentions "cancer" only by reference (once, to Kalyana-Sundaram, 2012), one finds among the sponsors both the Swedish Cancer Society and Stockholm's Cancer Society. Cancer will not be diagnosed, treated or cured by any "gene/whatever re-definition". "Cancer is a digital disease and will receive a digital cure", said one of the most mathematics-savvy genomists (Dave Haussler). When the axiom that the atom was the smallest, unsplittable particle of any element was brutally violated, for a while scientists went on auto-pilot with the mesmerized diligence of "old school chemistry" - even though the atom did split. Physicists, on the other hand, started to build quantum theory to (literally) arm mankind for the new challenge of the inevitably upcoming "nuclear age". Fractal mathematics is already mentioned in journalism as prize-worthy in the quest against cancer. Funds are important to keep software-enabling theory going - as disarmingly naive old-school theories finally need to be junked (Mattick). Assuming, of course, that we decide (along with IBM, Dell, Amazon, Google, Apple, Samsung, etc.) to change the steady course of zillions suffering miserable deaths - including a "Steve Jobs", who was potentially empowered to bring in the big guns of the IT industries in time - AJP]

HolGenTech Board of Advisers News - in a Perspective

The “Internet Boom” (1984-2000) repeats itself in “Genome IT Boom” (in R&D stage 2000-2012 and now in industrial stage 2013-2026). A few orders of magnitude bigger and bolder than anything we have seen.

With the present consolidation of the Industrialization of Genomics, the era of fireworks of conflicts of interest has dawned on us. Triggered by Calico, Google employee Zoltan Egyed resigned by November 7, 2013 from the Board of HolGenTech, so that FractoGene IP can be negotiated "at arm's length" with the major IT companies that are already committed to clinical genome analytics (see "Board" on the HolGenTech website).

On one hand, firms already engaged in Clinical Genome Informatics, like Google/Apple and IBM/Intel/Microsoft/Amazon/Dell/Xilinx/NVIDIA etc., attempt to prevail in a "sustained mode", inching through perhaps the most turbulent paradigm-shift in history. A sector-selective exclusive IP-license of FractoGene can be a safety-net for such protected tightrope-walking. On the other hand, ex-Google, ex-Facebook etc. entrepreneurs/investors are likely to hyper-escalate in a "disruptive mode" (in the manner formerly seen and proven by Bill Gates and Steve Jobs - now similar to the unique opportunities that Yuri Milner grabbed).

Dr. Egyed (Googler) introduces Dr. Pellionisz' Google Tech Talk YouTube 2008 (Is IT ready for the Dreaded DNA Data Deluge?)

Dr. Pellionisz' Google Tech Talk YouTube 2008 projects that the genome analytics "bottleneck" is NOT IT (information technology), but IT (information theory; a suitable algorithmic approach, such as the one that was the foundation of Google...)

As an example of how to steer clear of "conflict of interest" and "negotiate at arm's length": Zoltan Egyed was already a Google employee in 2008, when he introduced the Google Tech Talk "Is IT Ready for the Dreaded DNA Data Deluge" a mere five years ago. One may wish to re-view the presentation to see how it warned against a mega-billion-dollar failure of sequencing unmatched with analytics. Later, Dr. Egyed became a Board of Advisers member of HolGenTech, Inc., founded by Dr. Pellionisz. In the 2009 Churchill Club panel by Dr. Pellionisz, Dr. Nikolich asked [at 1:04:00]: "What if Microsoft were to acquire a pharma company, or Google were to build or acquire an IT-led pharma?" By now Google has already stepped beyond its discontinued "Google Health" by starting Calico (with Apple and even Roche/Genentech built-in ties)! Microsoft made an early exploration through Paul Allen a decade ago (Regulome), and the MS "HealthVault" still seems to be alive. For the healthy survival of a challenged Microsoft, some fundamental decision on Genome Informatics is due.

Now, with the Calico of Google/Apple (in itself an interesting business and intellectual property structure, especially with Roche/Genentech), Dr. Egyed, still a Google employee, resigned his HolGenTech Board of Advisers membership to avoid any "inside deal".

Samsung, Sony, Tata, Siemens, Philips, BGI/Complete Genomics round up the global horse-race.

The FractoGene US patent 8,280,641 for fractal diagnosis and therapy in genomics was filed in 2002 (see a most lucid early write-up by Hal Plotkin in the San Francisco Chronicle's SF Gate). The rather profound Claim 1 (of 21) of the FractoGene patent is: "A method to analyze and interpret information inherent in hereditary material of one or more organisms in terms of one or more fractal sets, in relation with one or more resulting fractal structures and one or more fractal functions of said one or more organisms, such that said one or more fractal sets are defined as a superposition over at least two iterations of a fractal template"

The patent is in force until late March, 2026

The Genome Is Fractal - Loops Of Minimal Distances Are Key To "Recursive Genome Function", Enabling "Distance Function"

[In a 2009 Science cover article, the Hilbert-fractal was invoked to explain the ultra-tight, knot-free folding of the DNA strand, referring to pioneering work by Alexander Grosberg decades earlier. Experimental evidence for "fractal folding" was provided in the 2009 Science paper by the Hi-C method, revealing that fractal folding brings into minimal distance each and every part of the DNA sequence - parts that would be "a great distance apart" if one inched along the two-yard-long linear thread. While not much attention was given to why such "minimal distance" is functionally advantageous, Pellionisz pointed out on this website that his 2008 "The Principle of Recursive Genome Function" called for minimal-distance loops to make recursive (fractal iteration) function effective. Very recently a couple of landmark papers provided evidence that "distance function" plays an essential role in genome regulation, essentially providing new dimensions to the classical notion, originating from Jacob and Monod (1961, Nobel in 1965), that "repressors" and "promoters" are linearly adjacent to particular gene(s). It might take a while to sink in that DNA function has long been misunderstood, based on the (mistaken) impression that the (human) 3.1 Bn A,C,T,G-s - just because they are stored and duplicated as the two intertwined linear strands of the double helix - "must" also function sequentially.

No longer true. Novel experimental evidence is mounting that the DNA functions as a massively parallel processor, where even "linearly distant" sequences are brought into functional proximity by the utterly clever "fractal folding". Core authors of the 2009 Science paper (with additional members on the team) came forward now, in 2013, with a major publication also in Science (see a journalist's write-up below, with link to the Science paper), clinching that loops (the essential conduits of recursion) "are key to the fractal globule". Pellionisz' "FractoGene" has since 2002 secured the utility of "fractal DNA governs fractal growth". To get a sense of the algorithmic, software-enabling approach, keep in mind that even the simplest and best known fractal (the Mandelbrot set) is no more and no less than a "recursion of parallel elements". In the Mandelbrot set, every new state is generated from the previous one via the utterly simple Z=Z^2+C repeat. However, every "Z" is in itself a parallel entity: a complex number (a composite of two elements, one real and one imaginary). Quaternions, as their name dictates, are composites of not two but four elements (their recursion resulting in the "Mandelbulb" fractal). FractoGene is now in clinical applications against cancer (since "fractal defects" were found by essentially the same Boston core group; see their Proof of Concept result that the "fractal globule" is compromised). Dozens more PoC experimental studies are cited in the Hyderabad Proceedings, 2012. In the era of the "Industrialization of Genomics", however, "best methods" (since the last CIP in 2007 of FractoGene) cannot be freely disclosed. However, the IP can be licensed. - Dr. Pellionisz, reachable at holgentech_at_gmail_dot_com]
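The generic Z=Z^2+C "recursion of parallel elements" can be sketched in a few lines of Python. This is an illustration of the textbook Mandelbrot iteration only - emphatically NOT the FractoGene method, whose "best methods" remain undisclosed:

```python
# Minimal sketch of the Mandelbrot recursion Z = Z^2 + C.
# Each Z is itself a "parallel entity": a complex number whose real and
# imaginary parts are updated together by one utterly simple rule.

def mandelbrot_iterations(c: complex, max_iter: int = 100) -> int:
    """Return how many iterations Z = Z^2 + C takes to escape |Z| > 2,
    or max_iter if C appears to belong to the Mandelbrot set."""
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c          # the entire "engine" of the fractal
        if abs(z) > 2.0:       # escape radius 2 bounds the set
            return n
    return max_iter

if __name__ == "__main__":
    print(mandelbrot_iterations(0 + 0j))   # interior point: never escapes
    print(mandelbrot_iterations(1 + 1j))   # exterior point: escapes quickly
```

Replacing the complex number with a four-component quaternion-like number in the same loop is what yields the "Mandelbulb"-style three-dimensional fractals mentioned above.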


Scientists find that loops of DNA are key to tightly packing [fractal] genetic material for cell division

CAMBRIDGE, Mass. - Scientists first discovered chromosomes in the late 1800s, after the light microscope was invented. Using these microscopes, biologist Walther Flemming observed many tightly wound, elongated structures in cell nuclei. Later, it was found that chromosomes are made from DNA, the cell’s genetic material.

Since then, scientists have proposed many possible ways that DNA molecules might fold into 3-D condensed chromosomes. Now, researchers at MIT and the University of Massachusetts Medical School have obtained novel data on the 3-D organization of condensed human chromosomes and built the first comprehensive model of such chromosomes.

In this model, DNA forms loops that emanate from a flexible scaffold; the loops are tightly compressed along the scaffold. “This is a very efficient way of packing DNA material,” says Leonid Mirny, an associate professor of health sciences and technology and physics at MIT and a senior author of a paper describing the findings in the Nov. 7 online edition of Science. [See their Science paper, elaborating on fractal properties that enable "distance action" through loops here - AJP]

This condensed state, seen only when cells are dividing, allows cells to neatly separate and distribute their chromosomes so that each daughter cell receives the full complement of genetic material. At all other times, the chromosomes are more loosely organized inside the cell nucleus.

Job Dekker, a professor of biochemistry and molecular pharmacology at UMass, is also a senior author of the paper. Lead authors are MIT graduate student Maxim Imakaev, Harvard University graduate student Geoffrey Fudenberg, and UMass postdoc Natalia Naumova. Other authors are UMass researcher Ye Zhan and UMass bioinformatician Bryan Lajoie.

Layers of structure

Chromosomes are complex molecules with several levels of organization, allowing cells to cram 2 meters of DNA into a nucleus that is only one hundredth of a millimeter in diameter. Long strands of DNA wind around proteins called histones, giving rise to a “beads on a string” structure. Several models have been proposed to explain how those strands of millions of beads are arranged inside tightly packed chromosomes.
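The packing feat quoted above is easy to check with back-of-the-envelope arithmetic (illustrative numbers taken from the paragraph itself):

```python
# Back-of-the-envelope check: ~2 meters of DNA are crammed into a nucleus
# about one hundredth of a millimeter (10 micrometers) in diameter.
dna_length_m = 2.0
nucleus_diameter_m = 1e-5  # 0.01 mm

# The linear compaction alone spans five orders of magnitude.
linear_ratio = dna_length_m / nucleus_diameter_m
print(f"DNA thread is {linear_ratio:,.0f} times longer than the nucleus is wide")
```

A 200,000-fold length ratio is why naive random packing would produce hopeless tangles, and why the knot-free folding discussed in this article matters.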

“There is no shortage of models of how DNA is folded inside a chromosome,” says Mirny, who is a member of MIT’s Institute for Medical Engineering and Sciences. “Every high-school biology textbook has a drawing of chromosomes folding. If you look at these drawings you might get the impression that the problem has been solved, but if you look carefully you see that all these drawings are very different.”

To help determine which model is correct, the researchers used a technology developed in Dekker’s lab called Hi-C, which performs genomewide analysis of the proximity of genomic regions. This reveals the frequency of interaction for every pair of regions in the entire genome.

The challenge, however, lies in generating an overall chromosome structure based on Hi-C data. “Given a three-dimensional structure, it is straightforward to find all contacts; however, reconstructing three-dimensional structures from contact frequencies is much more difficult,” Imakaev says.
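Imakaev's point - that the forward direction is straightforward while the inverse is hard - can be illustrated with a toy sketch (hypothetical bead coordinates and a hypothetical threshold, not the actual Hi-C pipeline): given a three-dimensional structure, "finding all contacts" is just a pairwise distance check.

```python
import math

# Toy illustration of the "easy" forward problem: from a known 3-D structure
# (beads along a chromatin-like chain), the contact matrix is a simple
# pairwise distance test. The hard inverse problem - recovering 3-D
# coordinates from contact frequencies alone - is what Hi-C analysis faces.
def contact_matrix(coords, threshold=1.5):
    """Return an NxN 0/1 matrix: 1 where beads i and j lie within threshold."""
    n = len(coords)
    return [[1 if math.dist(coords[i], coords[j]) <= threshold else 0
             for j in range(n)] for i in range(n)]

# A hairpin fold: beads 0 and 5 are maximally distant along the chain,
# yet the fold places them in direct spatial contact.
hairpin = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (2, 1, 0), (1, 1, 0), (0, 1, 0)]
m = contact_matrix(hairpin)
```

Note that m[0][5] is 1 (the chain's two ends touch) while m[0][2] is 0, even though bead 2 is linearly closer to bead 0 than bead 5 is - the simplest possible analogue of folding bringing "linearly distant" sequences into proximity.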

In 2009, researchers including Imakaev, Mirny, and Dekker used Hi-C to demonstrate that during most of a cell’s life, when it is not dividing, DNA is organized as a fractal globule, in which DNA is not tangled or knotted.

Hi-C also showed that regions with more active genes tend to cluster together in easily accessible compartments, and unused regions form more densely packed clusters. The organization of each chromosome varies among cell types, because every type of cell uses different sets of genes to carry out its function. This means that each chromosome acquires a specific 3-D organization depending on which genes a cell is using.

Chromosomes during cell division

In the new paper, the researchers found that as cells begin to divide, chromosomes are completely reorganized. First, all chromosome-specific and cell type-specific patterns of organization, which are necessary for gene regulation, disappear. Instead, all chromosomes are folded in a similar way as cells begin to undergo cell division, or mitosis. However, the chromosomes do not form the exact same structure every time they condense.

“Unlike proteins, which fold into very defined structures, the chromosomes form a completely different condensed object every time,” Fudenberg says. “It appears similar macroscopically but the individual regions of the genome can be folded in very different ways in different cells.”

The Hi-C technique “provides a modern day molecular microscope, with the power to see inside of these bodies and elucidate their principles of organization,” wrote Nancy Kleckner, a professor of molecular and cellular biology at Harvard University, in a perspective article accompanying the Science paper. The researchers “combine chromosome conformation capture with polymer physics simulations to provide a new, yet satisfyingly familiar, view,” she wrote.

The researchers believe that two stages are required to achieve the loop-on-a-scaffold structure: First, the chromatin forms loops — each of which contains about 80,000 to 120,000 DNA base pairs - radiating out from a scaffold made of DNA and some proteins. Then, the chromosome compresses itself along its central axis, where the scaffold is located.

While molecular details of the second stage remain mysterious, scientists have a good guess for what might be responsible for the first stage of chromosome folding: A team at Northwestern University recently proposed that proteins called condensins drive chromosome condensation by latching on to the DNA and extruding loops. To test this hypothesis in greater detail, the MIT team is now collaborating with these researchers.

Beyond characterizing condensed chromosomes, this study also opens the door for future work to understand mechanisms of chromosome condensation, cell memory, and epigenetic cell reprogramming.

The research was funded by the National Cancer Institute, the National Human Genome Research Institute, the Human Frontier Science Program, and the W.M. Keck Foundation.

"Junk DNA" as "Doing Nothing" became a Laughing Matter - Fractal Recursive Iteration is the Prevailing Paradigm of Genome Regulation!

[Composite of the Guardian-coverage illustration at right (titles added), rendering "Junk DNA" (as "doing nothing") literally a "laughing matter". Junk DNA was defined by Ohno in 1972 as DNA "for the importance of doing nothing"; see page 367 of the facsimile, reproduced in its entirety. The "about face" ideology-spinners (see YouTube "Encode 2013") and leftover bloggers now try to substitute "Junk" with "Unknown function". The higher their "declared percentage of ignorance", the more culpable and legally vulnerable their negligent "science teaching" becomes. The diagram on the left is the classic cover page of a "fractal face" by Michael Barnsley (original edition in 1988). "FractoGene" (the utility of fractal DNA growing fractal organisms; priority date 2002, now an issued US patent) is a software-enabling algorithmic approach to fractal recursive genome function; see peer-reviewed publications, PubMed 2006, 2008. - Pellionisz, contact HolGenTech_at_gmail_dot_com]

On October 25, 2013 a Science paper appeared by 20 authors from Lawrence Berkeley (USA), San Diego (USA), Edinburgh (UK) and Calgary (Canada), with lead authors Catia Attanasio and Axel Visel. The science focus of the experimental findings is "Fine Tuning of Craniofacial Morphology by Distant-Acting Enhancers".

Readers with a strong science background will find that the key word of the title is "DISTANT-acting enhancers". Promoters, suppressors etc. have long been found in non-coding DNA, since Jacob and Monod (1961, Nobel in 1965). However, "distant action" in the fractal DNA (see FractoGene, the software-enabling genome regulation theory since 2002) has been increasingly in the forefront, since e.g. the Hilbert-fractal minimizes the functional distances of linearly remote parts of the DNA strand (see Pellionisz, 2012, for the interpretation of "copy number alterations" clogging transparency and thus becoming "fractal defects", already linked to cancer, autism, auto-immune diseases, schizophrenia etc.).
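The locality property of the Hilbert fractal invoked above can be demonstrated with the standard textbook index-to-coordinate conversion (a generic Hilbert-curve routine, shown purely for illustration and unrelated to any patented method):

```python
def d2xy(n, d):
    """Map position d along a Hilbert curve to (x, y) on an n-by-n grid
    (n a power of two). Standard iterative conversion."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx                      # move into the proper quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# On a 4x4 grid, positions 1 and 14 along the curve are 13 steps apart in
# the 1-D sequence, yet they land in side-by-side cells (1,0) and (2,0).
curve = [d2xy(4, d) for d in range(16)]
```

In the genomic analogy (a 2-D cartoon of what is really a 3-D folding), the curve plays the role of the DNA thread and the grid the nuclear space: sequence positions far apart along the thread can end up as immediate spatial neighbors, a prerequisite for "distance action".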

The impact of the paper on the general readership is substantial, since the title refers to "craniofacial morphology". While the science was done in mice, it has been known since 2002 that the mere 20,000 or so "genes" of humans and mice are 98% homologous, and genome regulation in both of these multicellular vertebrate mammals is likely to be based on shared functional principles.

In case of homo sapiens, "craniofacial morphology" is otherwise known as "your face".

The "in your face" results of genome regulation, therefore, made the infamous "Junk DNA" misnomer (specifically arguing "the importance of doing nothing") literally a laughing matter; see press coverage by Lawrence Berkeley (Lynn Yarris; "What is it about your face?") and in the Guardian (Alok Jha; "Faces are sculpted by 'junk DNA'"). The wave of publicity reached even Huffington Post [and some 38 other media since the newsbreak].

What is it about your face? [Press release by Lawrence Berkeley, Lynn Yarris]

The human face is as unique as a fingerprint, no one else looks exactly like you. But what is it that makes facial morphology so distinct? Certainly genetics play a major role as evident in the similarities between parents and their children, but what is it in our DNA that fine-tunes the genetics so that siblings – especially identical twins – resemble one another but look different from unrelated individuals? A new study by researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has now shown that gene enhancers – regulatory sequences of DNA that act to turn-on or amplify the expression of a specific gene – are major players in craniofacial development. Berkeley Lab researchers identified distant-acting transcriptional enhancers in the developing craniofacial complex and studied them in detail in transgenic mice.

Faces are sculpted by 'junk DNA' [Press coverage by Guardian, Alok Jha]

Scientists have identified thousands of regions in the genome that control the activity of genes for facial features. 'Transcriptional enhancers' switch genes on or off in different parts of the face... Researchers have started to figure out how DNA fine-tunes faces. In experiments on mice, they have identified thousands of regions in the genome that act like dimmer switches for the many genes that code for facial features, such as the shape of the skull or size of the nose.

Specific mutations in genes are already known to cause conditions such as cleft lips or palates. But in the latest study, a team of researchers led by Axel Visel of the Lawrence Berkeley National Laboratory in Berkeley, California, wanted to find out how variations seen across the normal range of faces are controlled.

Though everybody's face is unique, the actual differences are relatively subtle. What distinguishes us is the exact size and position of things like the nose, forehead or lips. Scientists know that our DNA contains instructions on how to build our faces, but until now they have not known exactly how it accomplishes this.

Visel's team was particularly interested in the portion of the genome that does not encode for proteins – until recently nicknamed "junk" DNA – but which comprises around 98% of our genomes. In experiments using embryonic tissue from mice, where the structures that make up the face are in active development, Visel's team identified more than 4,300 regions of the genome that regulate the behaviour of the specific genes that code for facial features.

The results of the analysis are published on Thursday in Science.

These "transcriptional enhancers" tweak the function of hundreds of genes involved in building a face. Some of them switch genes on or off in different parts of the face, others work together to create, for example, the different proportions of a skull, the length of the nose or how much bone there is around the eyes.

"If you think about face development, a gene that is important for both development of the nose and the mouth might have two different enhancers and one of them activates the gene in the nose and the other just in the mouth," said Visel.

"Certainly, one evolutionary advantage that is associated with this is that you can now change the sequence of the nose or mouth enhancers and, independently, affect the activity of the gene in just one structure or the other. It may be a way that nature has evolved in which you can fine-tune the expression of genes in complex ways without having to mess with the gene itself. If you destroy the protein itself that usually has much more severe consequences."

In further experiments to test their findings, the scientists genetically engineered mice to lack three of the enhancers they had identified. They then used CT (computed tomography) scanning to build 3D images of the resulting mouse skulls at the age of eight weeks.

Compared with normal mice, the skulls of the modified mice had microscopic, but consistent, changes in the length and width of the faces, as expected. Importantly, all of the modified mice only showed subtle changes in their faces, and there were no serious harmful results such as cleft lips or palates.

Though the work was done in mice, Visel said that the lessons transfer across to humans very well. "When you look at the anatomy and development of the mouse versus the human, we find that the faces are actually very similar. Both are mammals and they have, essentially, all the same major bones and structures in their skulls, they just have a somewhat different shape in the mouse. The same genes that are important for mouse face development are important in humans."

Visel said that the primary use of this information, beyond basic genetic knowledge, would be as part of a diagnostic tool, for clinicians who might be able to advise parents if they are likely to pass on particular mutations to their children.

Peter Hammond, a professor of computational biology at University College London's Institute of Child Health, who researches genetic effects on facial development, said understanding how faces develop can be important for health.

"There are many genetic conditions where the face is a first clue to diagnosis, and even though the facial differences are not necessarily severe the condition may involve significant intellectual impairment or adverse behavioural traits, as well as many other effects," he said. "Diagnosis is important for parents as it reduces the stress of not knowing what is wrong, but also can be important for prognosis."

The technology to go beyond diagnosis and make precise corrections of the genome does not yet exist and, even if it did, it is not clear that changing genes or enhancers to create "designer" faces would be worthwhile. "I don't think it would be desirable to even attempt that. It's certainly not something that motivates me to work on this," said Visel. "And I don't think anyone working in this field would seriously view this as a possible motivation."

["The Face of Former Junk DNA" is important not only because it brings the heretofore "faceless" issue of the misnomer "Junk DNA" within immediate grasp of everyone. It is also cardinal because it focuses on "genome regulation" as a "distance function", and the paper specifically mentions (though does not elaborate on) "Copy Number Variations". CNVs ("repeats", with as many "definitions" as genomists you ask) are the new and trendy candidates for physiologically connecting (or pathologically clogging, as independent proof-of-concept experiments have shown in the case of cancers) distant parts of the genome, according to the Principle of Recursive Genome Function. The paper, therefore, "is not just another pretty face" but directs attention to causes that are likely to lurk behind, e.g., cancers. "Junk DNA" became a laughing matter for survivors. However, hundreds of millions do not laugh any more. Countless people died miserable deaths of, e.g., cancers over the many decades while first a scientifically mistaken, later a conveniently ignorant attitude, and in its terminal phase an ideological warfare of "belief systems" stood in the way of strict (mathematical) scientific progress. As explained both in a peer-reviewed paper (PubMed, full material free) and widely disseminated in a Google Tech Talk on YouTube (both in 2008), the erroneous axioms of "Junk DNA" and the "Central Dogma", in a dove-tailing manner, blocked the take-off of the "Principle of Recursive Genome Function" towards fractal iteration of growth. Lately, while fractals are quite warmly welcomed, obstinate detraction and its dishonest record are likely to linger - but perhaps no longer in taxpayer-supported mode. - AJP]

Improving genome understanding

The cost and accuracy of genome sequencing have improved dramatically. George Church asks why so few people are opting to inspect their genome.

Nature, 09 October 2013 (Corrected: 09 October 2013) [According to Nature Search, the original title was "Genomics is Mired in Misunderstanding" - AJP]

Readers of Nature, we can assume, are bright and insatiably curious. So why have so few obtained and interpreted their own genome sequence? We should avoid being judgemental of people who practise genomic modesty or who choose not to act on genome information, but we should also ask if we are providing adequate and equal access to education about the benefits and risks of genome information.

For 7 years I led one of the teams registered to compete for the US$10-million Archon Genomics X Prize, and I was naturally disappointed by the abrupt cancellation of the competition in August. However, the confusion surrounding the X Prize does provide an occasion to reflect on the problems and misunderstandings in genomics. The first is that genomics is seen as expensive. In fact, sequencing costs have plummeted — from $2.7 billion for the first human genome in 2003 down to $1,000 today. That’s not much more than the cost of a decent laptop, and much less than a car. However, people are reluctant to pay to have their genome sequenced — many feel that health care should be provided for free by insurance or the government and, indeed, this is our not-that-distant goal, as there are many in our community who would not benefit from genome information if it were not free. However, for those today who can afford a genome sequence, we would argue that, overall, the cost of sequencing is expected to be recovered over a lifetime through the avoidance of unnecessary diagnostics and therapeutics and time spent in waiting rooms and hospitals.

Perhaps too many think that genomics is inaccurate. When it announced the cancellation, the X Prize Foundation claimed that “no company is sequencing whole genomes to the accuracy the contest required”. Aside from the pre-judgemental weirdness, is this statement true? Haplotype phasing quality — a measure of accuracy — has improved from 350 kilobases in 2007 to 2,463 kilobases in 2013, and point errors have improved from 1 in 100,000 to 1 in 10 million — both well beyond the X Prize goals. Genetic analyses of tough tandem repeats are now common diagnostically.
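The error-rate improvement Church cites is easier to appreciate at genome scale. As a rough illustration (the ~3.2-gigabase haploid genome size and this back-of-the-envelope calculation are our own addition, not part of Church's article):

```python
# Back-of-the-envelope check of the accuracy gains cited above:
# point-error rates of 1 in 100,000 (2007) vs 1 in 10,000,000 (2013),
# applied to a ~3.2-gigabase haploid human genome.

GENOME_SIZE = 3_200_000_000  # approximate haploid human genome, in bases

def expected_errors(error_rate: float, genome_size: int = GENOME_SIZE) -> int:
    """Expected number of miscalled bases at a given per-base error rate."""
    return round(genome_size * error_rate)

errors_2007 = expected_errors(1 / 100_000)     # ~32,000 miscalled bases
errors_2013 = expected_errors(1 / 10_000_000)  # ~320 miscalled bases

print(f"2007: ~{errors_2007:,} expected point errors per genome")
print(f"2013: ~{errors_2013:,} expected point errors per genome")
print(f"Improvement: {errors_2007 // errors_2013}x fewer errors")
```

In other words, the hundredfold drop in the per-base error rate takes a whole-genome sequence from tens of thousands of expected miscalled bases down to a few hundred.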

Are the results uninterpretable? Even if we place the As, Cs, Gs and Ts in the right order, how does this help? Genome-wide association studies (GWAS) and studies of twins can give the impression that predicting traits from genomic sequence is a haphazard science. But since 1991 the number of highly predictive gene tests has risen from two to 3,000. Even ‘complex’ traits include components that can be identified and applied clinically to individuals who are not classed to be directly at risk. For example, height and diabetes GWAS have shown that a vast number of common variants have small effects, but the alternative of seeking rare variants reveals large effects by altering levels of growth hormone for height and insulin for diabetes. These hormones are effective therapies even for individuals who are not mutant in them. Too often the messy results of GWAS and twin studies are down to poor selection of subjects and neglect of confounding environmental factors.

Even if they are interpretable, are the results useful? Yes! Even if there is no cure for the genetic conditions identified, there are effective preconception and prenatal options that could have an impact on the family. For example, Ashkenazi communities already use genetic screening to make lists of suitable marital partners early in life to avoid their offspring developing painful Tay–Sachs disease and more than 20 similarly devastating diseases (which are not restricted to their community, by the way). Although we are tempted to restrict genomics to those with ethnic or family risks, the fact is that we are all at risk. Even the possibility of finding markers for one treatable disease (such as a cancer or cardiomyopathy) could, for some, be a sufficient reason to check one’s genome.

Perhaps most provocatively, some critics assert that genomics could be harmful. The US Genetic Information Nondiscrimination Act (GINA) prevents discrimination in health insurance and employment based on genetic information; however, there is not a GINA in every country, and it doesn’t cover the military, life insurance or person-to-person discrimination. But the question is: do the overall benefits of genomics exceed the risks? Do the benefits of driving trump the one-and-a-quarter million traffic-related deaths per year? A growing number of bioethicists and researchers are worried that typical consenting practices do not inform patients of the likelihood of data escape and re-identification. Certainly, conventional consents served to protect the researchers, not the volunteers. However, the huge numbers of volunteers who are willing to share their genetic data make this a moot point. Why insist on recruiting those — and setting policy around those — who would be upset if their data escapes?

It is important for those of us at the sharp end of work on genomics to work equally hard at conversations with the public. We already share our (very revealing) faces, voices and opinions. And, as we share more of our genetics and as we develop genomic progress into precision medicine, researchers and the public alike need frank assessments of all of these tests and treatments. We need the Genomics X Prize more than ever.

Nature 502, 143 (10 October 2013) doi:10.1038/502143a

[Excerpts from Comments]

Andras Pellionisz

Prof. Church answers with the title of his lucid assessment what exactly is required to streamline genomics for clinical use. The pendulum of interpretation must swing first from "knowledge" to "understanding". Nobels have just been dished out in Physics and Chemistry for predictive and experimentally testable theories and computer modeling to cope with data-sets incomprehensible for the un-aided human brain. However, software-enabling algorithmic understanding of recursive genome function is still at a pioneering stage.

A handful of cynics cannot let go of their "old school", ex cathedra condemning the majority of the 6.2 billion DNA bases "for the purpose of doing nothing" [so conceited, so full of themselves - mostly junk - that they bypass even a friendly gesture offering a graceful way out]. Perhaps people easily overlook such negligence today, but attitudes will change dramatically as genome interpretation allows us to differentiate between successful treatment and persistent or aggravated illness. The old-school mentality is in fact responsible for millions of unnecessary deaths. Computer-aided genome interpretation could have made a difference for Steve Jobs, the high-tech genius. Tragically, he died of cancer at the peak of his life, surrounded by his Silicon Valley IT company. Jobs left behind the most valuable company in the world (Apple), while running out of time to develop advanced genome interpretation via computers and information technology!

Ironically, most of us would not consider checking into a hotel where there is no Wi-Fi. If we can’t connect our iPhone, iPad, or "iEverything", we turn away from such antiquated businesses. Shouldn't we have the same attitude toward genome based healthcare? Shouldn't we demand treatments such as precision chemotherapy based on our individual genomes?

So when is the daybreak of genomics for the masses? When the tide of patients and health-conscious movers & shakers, before checking into any hospital, will demand an answer to the vital question "What Can You Do For MY Genome"? Once the trend is sparked, masses will follow and insist on cancer therapy which is based on genome sequencing and computer-aided interpretation!

All Bets Are Off: Silicon Valley Goes For IT Google-Style!

Can Google Calico Conquer Death?

Time Magazine Cover Issue

We all know that Google has created many a miracle — but, can it conquer death?

Today, the company announced Calico, a new health-and-wellness initiative that aims to put the company's ingenuity into its biggest challenge yet: conquering cancer and, ultimately, death. Google CEO Larry Page said in a statement: "Illness and aging affect all our families. With some longer term, moon-shot thinking around health care and biotechnology, I believe we can improve millions of lives. It’s impossible to imagine anyone better than Art — one of the leading scientists, entrepreneurs, and CEOs of our generation — to take this new venture forward.” Apple chairman Arthur D. Levinson, who will head up the organization, added that he's devoted a lot of energy toward advancements in tech and science designed to make life better (and longer), and is excited about Calico's prospects.

A corresponding TIME Calico cover story goes much deeper, with Google CEO Larry Page explaining why he thinks his latest "moon shot" is not such a crazy idea, given Google's determined unconventionality and resources. Just don't expect immediate impact.

"In some industries, it takes 10 or 20 years to go from an idea to something being real. Health care is certainly one of those areas," Page told TIME. "We should shoot for the things that are really, really important, so 10 or 20 years from now we have those things done."

Does that mean a cancer cure is in the works? It's obviously on Calico's collective mind, but it's not necessarily the first priority. "One of the things I thought was amazing is that if you solve cancer, you’d add about three years to people’s average life expectancy," Page says. "We think of solving cancer as this huge thing that’ll totally change the world. But when you really take a step back and look at it, yeah, there are many, many tragic cases of cancer, and it’s very, very sad, but in the aggregate, it’s not as big an advance as you might think." What Page really wants to focus on is researching how to eliminate a greater spectrum of aging problems and illnesses, and luckily, Google has a whole lot of funding ($54 billion, to be exact) to put toward that odyssey. [Funding of NIH dipped seriously below $30 billion - AJP]

"Immortality" is the buzzword on everyone's lips, as well as Calico's assumed long-term goal, though Page hasn't officially confirmed this. If that all sounds a bit Neuromancer to you, take heart and remember that Google engineer (and renowned futurist) Ray Kurzweil has already made it clear that transcending biology — via upgrading the human body like software — is in his, and now his company's, very real career aims. So, strap in, this will be a long, possibly infinite ride.


Google announces Calico, a new company focused on health and well-being

MOUNTAIN VIEW, CA – September 18, 2013 – Google today announced Calico, a new company that will focus on health and well-being, in particular the challenge of aging and associated diseases. Arthur D. Levinson, Chairman and former CEO of Genentech and Chairman of Apple, will be Chief Executive Officer and a founding investor.

Announcing this new investment, Larry Page, Google CEO said: “Illness and aging affect all our families. With some longer term, moonshot thinking around healthcare and biotechnology, I believe we can improve millions of lives. It’s impossible to imagine anyone better than Art—one of the leading scientists, entrepreneurs and CEOs of our generation—to take this new venture forward.” Art said: “I’ve devoted much of my life to science and technology, with the goal of improving human health. Larry’s focus on outsized improvements has inspired me, and I’m tremendously excited about what’s next.”

Art Levinson will remain Chairman of Genentech and a director of Hoffmann-La Roche, as well as Chairman of Apple. [Plus Chairman of the Board of "Breakthrough Prize" - AJP]

Commenting on Art’s new role, Franz Humer, Chairman of Hoffmann-La Roche, said: “Art’s track record at Genentech has been exemplary, and we see an interesting potential for our companies to work together going forward. We’re delighted he’ll stay on our board.”

Tim Cook, Chief Executive Officer of Apple, said: “For too many of our friends and family, life has been cut short or the quality of their life is too often lacking. Art is one of the crazy ones who thinks it doesn’t have to be this way. There is no one better suited to lead this mission and I am excited to see the results.” [Tim did not have to mention an ultimately exemplary person by name to make a key point: some kind of death is inevitable for all of us - but some diseases, cancer being a prominent example, deprive their victims not only of longevity but even of a decent quality of life, in some cases through many, many years of terrible suffering. - AJP]

Contact Leslie Miller

Google Corporate Communications


WTF Is Calico, And Why Does Google Think Its Mysterious New Company Can Defy Aging?

Gregory Ferenstein


September 19, 2013

The sad truth is that, if everyone on the Forbes 400 list simultaneously (and tragically) got cancer, or Parkinson's (or any given disease for that matter), the world would probably be well on its way to finding a cure for these illnesses, thanks to the enormous wealth that would be incentivized to back those efforts.

Finding a cure for an intractable disease requires time, enormous amounts of human and financial capital, cooperation and research — and at least a few public-private partnerships. It’s costly, and it’s messy. This is why Calico, Google’s newest mad science project, is potentially so exciting.

In fact, Calico could represent the company’s largest health-care initiative since Google Health sprinted its way into obscurity. Of course, Google is a different company today than it was in 2008 (when it launched Google Health) and so are we. Our habits have changed: Today, 20 percent of smartphone users have downloaded at least one health app, and 60 percent of adults now look for health information online.

Led by former Genentech CEO and current Apple Chairman, Arthur D. Levinson, Calico has big plans in health care — at least over the long term. From what we’ve heard thus far, the new project will leverage Google’s massive cloud and data centers to help facilitate research on disease and aging by mining its trove of data for insight into their origins.

Plus, thanks to its investment in 23andMe, Google already has access to a fast-growing genomic database, which could come in handy as it begins to focus on, in its words, “health and well-being — in particular the challenge of aging” and dive into the science, genetics and biochemistry behind longevity and disease.

In an interview with TIME Magazine, Google CEO Larry Page implied that dramatically extending human life is one of Calico’s main goals; not making people immortal per se, but, according to a source familiar with the project, increasing the lifespan of people born 20 years ago by as much as 100 years.

“Are people really focused on the right things?” Page muses in the interview. “One of the things I thought was amazing is that if you solve cancer, you’d add about three years to people’s average life expectancy. We think of solving cancer as this huge thing that’ll totally change the world, but when you really take a step back and look at it, yeah, there are many, many tragic cases of cancer, and it’s very, very sad, but in the aggregate, it’s not as big an advance as you might think.”

While his delivery is a bit confusing, Page’s thesis seems to be an optimistic one: While curing cancer has always seemed like an insurmountable obstacle, the goal is more within reach than many believe — if only someone would just put their mind to it, he seems to say. Yes, Larry Page is brilliant, but his message also seems to imply that diseases (and their cures) are reducible — that all of the world’s problems could be cured if we just had some snappier algorithms. [Sir Alexander Fleming did not (have to) use any "algorithm" for serendipitously coming up with penicillin, nor is there any algorithm behind aspirin. The genome is, however, a "digital code" of life (more precisely, of hologenome regulation, based on the Principle of Recursive Genome Function). The expression "cracking the DNA code" has been (ab)used far too many times; in fact, the code has only been revealed. "Cracking" of, e.g., encrypted messages is a procedure that nobody even remotely familiar with mathematics would think of in any other way than by algorithms programmed into high-performance computers. Larry Page, mastermind of "PageRank" (see their brilliantly lucid core patent), is absolutely right on target (along with David Haussler of UCSC, etc.) that "cancer is a digital disease that will have a digital cure". - AJP]

Really, it almost seems more a reflection of the enormous, many-headed beast that Google has become, as well as a testament to its resources and the type of talent it’s able to attract — rather than pure, unbridled hubris. It’s as if they’re saying: “Hi, welcome to Google! Today, we’re going to turn your eyeglasses into a computer, tomorrow we’ll develop self-driving cars, the day after that, we’ll cure cancer and increase the average human lifespan by 100 years. Oh, and by the way, we’re still trying to organize all of the world’s information and make it easily searchable!!” N.B.D., everybody, N.B.D.

Interestingly, Calico doesn’t seem to be a Google company per se, more of an investment in a new company that will be affiliated with Google and become an extension of the company’s mad science lab, Google X. “Don’t be surprised if we invest in projects that seem strange or speculative compared