Posted: June 7th, 2011 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
I first read about the work of Geoffrey West on “superlinear scaling” in Steven Johnson’s Where Good Ideas Come From. West is a British theoretical physicist, and a former president and distinguished professor of the Santa Fe Institute. His long-term fascination with scaling phenomena led him to look for universal scaling laws that pervade not only biology, from the molecular and genomic scale up to whole organisms and ecosystems, but also social structures, in particular cities and companies.
Just yesterday I was glued to my laptop screen for 52 minutes, listening to West’s fascinating monologue on the underlying principles that govern biological, urban, and business growth. His talk from Edge magazine is truly captivating, delivering a convincing framework for universal scaling. Here are two short passages by West:
I think this is very much science of the 21st century, because it is the kind of problem that scientists have ignored. It is under the umbrella of a complex adaptive system and we need to come to terms with understanding the structure and dynamics and organization of such systems because they’re the ones that determine our lives and our extraordinary phenomenon that we have developed on this planet.
[Cities] are the origin of the problems, but they are the origin of the solutions. And we need to come to terms with that, and we need to understand how cities work in a more scientific framework, meaning to what extent can we make it into a quantitative predictive, mathematizible kind of science.
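West’s central claim can be condensed into a single power law, Y = Y0 · N^β, where Y is some urban quantity and N is the city’s population; in his published results, socioeconomic outputs scale superlinearly, with β around 1.15. As a rough illustration (the function and normalization below are my own, not West’s):

```python
def scale(y0, population, beta):
    """Power-law scaling: Y = y0 * N**beta."""
    return y0 * population ** beta

# Superlinear exponent (~1.15), typical of socioeconomic outputs in
# West's city data; y0 = 1.0 is an arbitrary normalization constant.
small = scale(1.0, 1_000_000, 1.15)
big = scale(1.0, 2_000_000, 1.15)
print(big / small)  # ~2.22: twice the population, ~2.2x the output
```

The same formula with a sublinear exponent (roughly 0.85 for infrastructure, in West’s account) reverses the effect: doubling a city’s population requires less than double the roads and cables.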
Posted: February 27th, 2011 | Author: Manuel Lima | Filed under: Uncategorized | 2 Comments »
I continue to be utterly fascinated by Amira Skomorowska’s tweets, notes, posts, and findings. Here is one of the many absorbing finds:
Left brain: I am the left brain. I am a scientist. A mathematician. I love the familiar. I categorize. I am accurate. Linear. Analytical. Strategic. I am practical. Always in control. A master of words and language. Realistic. I calculate equations and play with numbers. I am order. I am logic. I know exactly who I am.
Right brain: I am the right brain. I am creativity. A free spirit. I am passion. Yearning. Sensuality. I am the sound of roaring laughter. I am taste. The feeling of sand beneath bare feet. I am movement. Vivid colors. I am the urge to paint on an empty canvas. I am boundless imagination. Art. Poetry. I sense. I feel. I am everything I wanted to be.
Posted: January 26th, 2011 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
It’s been more than a year since I started this long, exciting journey, 17 months to be precise, and I’m very happy to announce today that the book is finally available for pre-order from Amazon, Barnes & Noble, and Waterstones (amongst others). With a foreword by Lev Manovich, VisualComplexity: Mapping Patterns of Information is a deep dive into two major disciplines that have witnessed a meteoric rise in the last decade: network science and information visualization.
The book is far more than a showcase of VisualComplexity.com. In fact, around one third of the estimated 300 featured images have never been showcased on the website. Due to its broader scope, it almost seems that the website emerged from the book, rather than the other way around. The book offers a comprehensive view of the visual representation of networks, delving into historical precedents (with some beautiful medieval imagery), various contemporary subjects and methods, and a range of future prospects. It looks at the depiction of networks from a practical and functional perspective, as a key driver for understanding the complex connectedness of modern society, but also explores the alluring qualities of the network schema, responsible for a considerable shift in contemporary art and culture. Divided into seven chapters, the book is ultimately a testimony to the enthralling power of networks and visualization.
In July 2009, while still working for Nokia, I signed a contract with Princeton Architectural Press for what would be one of the most challenging experiences of my life. I was immediately concerned with the deadline ahead of me, since I knew how much needed to be done. On Saturday, July 25, while riding the train home from TEDGlobal in Oxford, UK, and after being inspired by every single person and story at that conference, I decided the only possible way I could accomplish the book was to embrace it full-time. The following Monday I communicated my decision to Nokia, and in late August I left the company.
What followed were months of hard, intense work: countless hours, more than 1,200 emails, and many ups and downs. I had an idea it wouldn’t be easy, but I never expected it to be so hard. I guess it doesn’t help when you’re too meticulous, self-critical, and demanding. A lot has changed in my life since then. I got married only a few months later, and more recently started a new job and moved to a different city in a different country. Coincidentally, I’m now living a few blocks from Princeton Architectural Press, a publisher I have come to care for and respect like no other. The last couple of months have also been very rewarding, as you finally see all the loose pieces of the book come together into a coherent whole with a unique personality.
There’s a lot more to say, and I promise I will keep posting more updates, interesting facts, page samples, final cover design, and related events in the coming months. But in the meantime, if you want to make sure you get the book as soon as it comes out in late summer, you can pre-order it on Amazon. Pre-orders are also a great way to support an author, since it shows advance interest in the book. You can also follow updates on Twitter or Facebook.
Posted: January 25th, 2011 | Author: Manuel Lima | Filed under: Uncategorized | 1 Comment »
When I first started grouping projects in VC by visual method, in June 2007, radial convergence was already the most popular group, with roughly eight projects. You can see that early classification on a now-extinct page of VisualComplexity.com, as captured on June 10, 2007 (thanks to the Wayback Machine):
As you can see from the image above, those eight radial convergence projects were amongst the first to be indexed in VC. Interestingly enough, three of them, AS Internet Graph (2002), GNOM (2005), and Circos (2005), are amongst my favorites to this day. Although I had started talking about this method at conferences like MeshForum (San Francisco) and reboot 9.0 (Copenhagen, Denmark), it remained nameless for a while. The label arose from the need to classify this and other layout types within the growing collection of projects. Since the model is essentially defined by a radial ordering of items converging on one another, the title radial convergence became an intuitive fit. It was hard to predict, however, how much the method would take off. Over the last three years it has become immensely popular, and it seems that with every batch of new projects added to VC there’s always one exhibiting this favored layout.
There are probably many reasons for this popularity. First, it’s simple to execute: it’s probably one of the easiest and fastest ways to prototype or visually convey a relational dataset. Second, it’s remarkably alluring. Humans have a widely documented obsession with the circle and many of its iconographic qualities, revered through the millennia: divinity, perfection, unity, closure. Third, if we add to these reasons the growing availability of data, the number of visualization enthusiasts, and easy-to-use software, then we have the perfect conditions for growth, multiplication, and increasing popularity.
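For readers curious about the mechanics, the method’s geometry really is that simple: items are spaced evenly around a circle, and relations are then drawn as chords across its interior. A minimal sketch of the positioning step (the function name and details are my own illustration, not code from any VC project):

```python
import math

def radial_layout(items, radius=1.0):
    """Place each item at an evenly spaced angle around a circle.

    Returns {item: (x, y)}. Edges between related items are then
    drawn as chords (straight or curved lines) across the interior,
    which is what gives radial convergence its characteristic look.
    """
    n = len(items)
    positions = {}
    for i, item in enumerate(items):
        angle = 2 * math.pi * i / n
        positions[item] = (radius * math.cos(angle), radius * math.sin(angle))
    return positions

pos = radial_layout(["a", "b", "c", "d"])
# With four items the angles are 0, 90, 180, and 270 degrees.
```

Everything beyond this step (ordering the items, bundling or curving the chords) is where individual projects differ.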
Currently there are 33 projects under radial convergence in VC, mapping a variety of subjects, from IP addresses to Facebook friends. Here’s a screenshot of all of them, as of January 24, 2011.
Posted: November 2nd, 2010 | Author: Manuel Lima | Filed under: Uncategorized | 2 Comments »
It has been a while since I’ve posted anything on this blog or in VC. Here’s why:
In mid-July my wife and I left London, after 4 great years living in the English capital. Our last days were packed with parties and plenty of drinks. We moved to Portugal together with roughly 580 kg (35 boxes) of our stuff - with books being the heaviest category at 220 kg (to my wife’s annoyance).
Portugal (Lisbon, Batalha, Ponta Delgada, Albufeira)
August and September were spent in different areas of Portugal, enjoying the sun with our family and friends. During this period I also finished the second revision of the VC book. After a long recruitment/visa/relocation process, the day of the big move finally arrived.
On October 18th, 2010, I joined Microsoft Bing as a Senior UX Design Lead, and after a few days of orientation in Redmond, we finally moved to New York City where we’re now living in our temporary apartment.
We’re still getting used to the NYC lifestyle and slowly losing our tourist badges. New York City will be our home from now on, so in case you want to meet or reach me in any way, you know where to find me.
Future + Bing
You can certainly expect more regular updates on VC, now that my life is settling down again. As for the VC book, it’s currently on its third revision and will be available for purchase next year (more information to come soon). After a year of writing, researching, speaking, and occasional consulting, it’s quite stimulating to roll up my sleeves once again and join the great design team at Microsoft Bing. I’m very excited about this new challenge and all its future possibilities. Interesting times ahead…
Posted: November 1st, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
The first time I tweeted about RSA Animate was in July 2010, when I posted about the canny Crisis of Capitalism video. Since then the Royal Society of Arts (RSA) has released a few more videos that are remarkable examples of visual storytelling. If you haven’t seen any of these highly addictive pieces, you don’t know what you’re missing.
See all videos here:
Posted: July 26th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | 13 Comments »
This is an original guest post by Ricardo Nuno Silva for VC Blog. Ricardo is a Portuguese applications developer with a longtime curiosity concerning the impact of digital technologies in everyday life. You can contact him at firstname.lastname@example.org.
In the last few years many tools and techniques have been developed to help us visualize songs, music, and sounds. This post is a showcase of some of the best of these tools. Each one is focused on a particular aspect of this challenging type of visualization.
The most common examples of sound-visualization software are the visualizers built into media players, but these usually only translate sound frequencies into shapes and colors on the screen. They have been used extensively for leisure, relaxation, and dance parties.
The tools in this showcase take a different approach, as they truly “understand” music down to its individual notes. Some can be used in real time, while others need to do some number-crunching to analyze each song.
Below each image there’s the name of the tool or technique, some great video examples, and a link to the author’s site.
If you know other great tools or videos of music visualization, please leave a comment below or via email. Thank you!
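To make “understanding music down to its individual notes” a little more concrete: pitch-aware visualizers generally map detected frequencies onto discrete pitches, most often MIDI note numbers. The conversion below is the standard equal-temperament formula, not code taken from any of the tools showcased here:

```python
import math

def freq_to_midi(freq_hz):
    """Map a frequency to the nearest MIDI note number.

    Uses the equal-temperament convention A4 = 440 Hz = MIDI note 69;
    each semitone corresponds to a frequency factor of 2**(1/12).
    """
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(freq_to_midi(440.0))   # 69 (A4)
print(freq_to_midi(261.63))  # 60 (middle C)
```

Real tools of course do far more (onset detection, polyphony, timbre), but this mapping from continuous frequency to discrete note is the common first step.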
Static Visualization of Songs
The Shape of Song by Martin Wattenberg
Narratives 2.0 by Matthias Dittrich
Similar to Sheet Music
Don’t Be Sad by Brad Mehldau
MIDI Music Visualization Videos for Deaf and Hearing Impaired People by Eric Rangell.
Music Animation Machine (MAM) by Stephen Malinowski. See: Beethoven 5th Symphony.
Visualization of Instruments Output
Clavilux 2000 by Jonas Heuer.
Celeste Motus by the Abstract Birds. (via Pedro Custódio)
MuSA.RT - Music on the Spiral Array. Real-Time by Elaine Chew and Alex François.
TypeStar by Scott Garner.
Learning Games through Visualization
Synthesia (for piano) by Nicholas Piegdon.
Animating Virtual Instruments
MIDIJam by Scott Haag. See: MidiJam (I just died in your arms).
Pipe Dream by Animusic. See also: MIDIJam meets Animusic: Pipe Dream
Ljósið by Ólafur Arnalds
Just Colour by Jesper Brevik
See other music-related visualizations @ VisualComplexity.com | Music.
Posted: April 19th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | 2 Comments »
Ocean explorers are puzzling out Nature’s purpose behind an astonishing variety of tiny ocean creatures like microbes and zooplankton animals – each perhaps a ticket-holder in life’s lottery, awaiting conditions that will allow it to prosper and dominate.
The inventory and study of the hardest-to-see sea species — tiny microbes, zooplankton, larvae and burrowers in the sea bed, which together underpin almost all other life on Earth — is the focus of four of the 14 field projects of the Census of Marine Life.
The results from the latest census revealed spectacular examples of hard-to-see underwater microbes, available in this stunning gallery of some of the smallest sea species.
Posted: April 18th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | 4 Comments »
flame dragon, by peter blaskovic (created in flame painter)
As I was organizing my RSS feeds in feedly, I stumbled upon Gert K. Nielsen’s piece on Visual Journalism, written on March 22, 2010. The venturesome title of Nielsen’s post was “The next big thing in infographics - five criterias and a solution“. Intriguing and stimulating. I was immediately on board. That is, until I started reading his five recommendations and final proposed solution. You should read it and draw your own conclusions, but I found Nielsen’s piece absolutely bewildering.
- The first recommendation, on the need for computer-generated infographics, reads more like a natural progression of the field than a recommendation, and is perhaps the most innocuous of the list. The second and third criteria, on the other hand, are a bit more disconcerting.
- “It must be beautiful”, Nielsen says in the beginning of his second suggestion. Nothing wrong with that, but you would expect some reflection on the benefits of aesthetics to follow that statement. However, Nielsen appears to be infatuated with aesthetics solely for its popularity… As he explains, “right now the interest is on presentation much more than the content”.
- But the third criterion is even more baffling. “It has to be somewhat ambiguous”, states Nielsen. Yes, take a deep breath and read it again. And perhaps, like me, you’ll wonder: what? But wait, Nielsen immediately comes to our rescue, grounding his view in a remarkable argument. “Describing things in black and white and sharp vector lines is too fanatic. Blends are much better suited to describe a complex situation”. Yes, let’s reconsider this fanaticism for objectivity, clarity, and content. The future of infographics is ambiguousness!
- (I didn’t quite understand this point, so if someone does please explain.)
- Moving on to his fifth criterion, since I couldn’t grasp the fourth, Nielsen asserts, “It needs to work in online presentations too”. This could be an interesting starting point for an analysis of the different contexts in which infographics are used and the variety of platforms they could explore, but Nielsen falls short in his explanation, merely stating that infographics could be integrated into online presentations “perhaps by moving or evolving over time”. A very light investigation, to say the least.
But perhaps the most disquieting part of the post was the solution Nielsen proposes for the future of the field. As he explains: “The solution I came up with is particles in 3D-programs“. Brilliant! According to Nielsen, there’s no particular downside to 3D particles (think about clarity and legibility), apart from their demanding learning curve, or in other words, the time it takes to learn these “really tough concepts”. In his pursuit of ambiguousness it’s not entirely surprising that Nielsen fails to consider any other drawback to his formula. His proposed solution becomes slightly more tangible when he presents an example of this vision: Flame. As he explains, “the ability to paint with ‘flames’ fits right into my expectation of seeing graphics with an appearance that fits the current times”.
I will not expand too much on how seriously distressing I find this view, since I’ve done so before, more than once. But it feeds the growing confusion that Robert Kosara alludes to in his latest post, The Visualization Cargo Cult. Gert Nielsen’s post, as puzzling as it might seem, is a reflection of a seriously disturbing view that sees objective infographics as a thing of the past, and appealing ambiguousness as a much better fit for the “current times”. I just hope it doesn’t become a contagious meme.
Posted: April 13th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
NYT - Obama’s 2011 Budget Proposal: How It’s Spent
Rectangles in the chart are sized according to the amount of spending for that category. Color shows the change in spending from 2010.
A zoomable treemap for the life records of the Natural Science Museum of Barcelona, by Bestiario.
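The sizing rule behind both treemaps is the same: each rectangle’s area is proportional to its value. Here is a minimal sketch of the simplest variant, slice-and-dice (the NYT and Bestiario pieces almost certainly use more sophisticated tilings such as squarified layouts; this function is purely illustrative):

```python
def slice_treemap(values, x, y, w, h, horizontal=True):
    """Slice-and-dice treemap: split a rectangle into parallel strips
    whose areas are proportional to the given values.

    Returns a list of (x, y, width, height) tuples, one per value.
    """
    total = sum(values)
    rects = []
    offset = 0.0
    for v in values:
        frac = v / total
        if horizontal:  # slice along the x axis
            rects.append((x + offset, y, w * frac, h))
            offset += w * frac
        else:           # slice along the y axis
            rects.append((x, y + offset, w, h * frac))
            offset += h * frac
    return rects

# Three spending categories of 500, 300, and 200 (arbitrary units)
# inside a 100x50 canvas: strip widths come out 50, 30, and 20.
rects = slice_treemap([500, 300, 200], 0, 0, 100, 50)
```

A zoomable treemap like Bestiario’s simply re-runs the layout on a child’s subtree, using that child’s rectangle as the new canvas.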
Posted: April 12th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
Most VC readers will already be familiar with Data Flow 2, the most recent installment in the growing Data Flow family, published in February 2010. Featuring several interviews, including ones with New York Times graphics editor Steve Duenes and Art+Com director Joachim Sauter, and a joint one with Andrew Vande Moere and myself, the book is an inspirational compendium of hundreds of projects. The work presents itself as a portfolio book, featuring an array of innovative approaches (many featured in VC) that are incredibly provocative and inspiring. Due to its coffee-table nature, the title doesn’t aim at an in-depth analysis or theoretical reflection on the displayed projects and defined categories, but acts primarily as a stimulating showcase of ideas.
As Andrew Vande Moere eloquently states in his review, the foreword doesn’t quite align with the book’s content, since most of its claims of insightfulness are not necessarily substantiated by the variety of executions showcased throughout the book. Nonetheless, Data Flow 2 is a great source of inspiration for anyone working in the domain of data visualization.
Posted: April 7th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
A very appealing spline-based 3D form in Processing that represents the bass frequency and puts it into motion. As Christian Bannister explains:
What would the bass look like? What would it be like to touch it and manipulate it directly and visually in real-time? These are some of the things I am trying to get at in this sketch.
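To make “representing the bass frequency” concrete: a sketch like this one typically measures how much of a short audio frame’s spectral energy falls below some cutoff, then drives the geometry with that number. A rough illustration in Python rather than Processing (my own naive DFT sketch, not Bannister’s code; the 200 Hz cutoff is an arbitrary choice):

```python
import math

def bass_energy(samples, sample_rate, cutoff_hz=200.0):
    """Return the fraction of spectral energy below cutoff_hz,
    using a naive DFT (fine for short frames; real code would
    use an FFT)."""
    n = len(samples)
    energy_low = energy_total = 0.0
    for k in range(1, n // 2):  # skip DC, use positive-frequency bins
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        energy_total += power
        if k * sample_rate / n < cutoff_hz:
            energy_low += power
    return energy_low / energy_total if energy_total else 0.0

# A pure 100 Hz tone: nearly all energy sits below the 200 Hz cutoff.
tone = [math.sin(2 * math.pi * 100 * t / 8000) for t in range(400)]
print(bass_energy(tone, 8000))  # close to 1.0
```

In a live sketch this value would be recomputed per frame and mapped onto the spline’s control points, giving the pulsing motion Bannister describes.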
Posted: April 7th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
Unfortunately the Call for Participants is now closed, but this initiative should nonetheless be interesting to follow. Synthetic Aesthetics aims to bring creative practitioners, and those expert at studying, analyzing, and designing the synthetic/natural interface, together with the existing synthetic-biology community, to help with the work of designing, understanding, and building the living world.
From this thought-provoking premise:
Biology has become a new material for engineering. From the design of biological circuits made from DNA to the design of entire systems, synthetic biology is very much interested in making biology something that can be designed.
The project asks:
Can collaborations between synthetic biology and design inform and shape the developing field of synthetic biology—the engineering of new and existing biological entities for useful means? What insights can design offer in designing microscopic entities for a human-scale world? Can design learn from synthetic biology?
Posted: February 10th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | 2 Comments »
(Originally written in May, 2009 - Extinct section in VC)
Recursive cycles of innovation happen in many different areas, most notably in the domains of Art and Design. Even those who are completely out of touch with fashion can observe how the field is constantly rediscovering its past, recycling ideas, and incessantly mixing new trends with old influences. In Economics, it’s a well-known fact that business trends and stock markets expose processes that tend to repeat themselves in a more or less regular fashion. In this domain, the best-known study was developed by Nikolai Kondratiev, a Russian economist who observed a series of long, 50-year cycles in the modern world economy. The Kondratiev waves, as they were later called, consist of alternating periods of high and low sectoral growth, which in most cases have proved accurate since the end of the 18th century. In fact, from whichever angle you look at history, there’s always someone ready to point out a specific recursive cyclical pattern.
But even though we are accustomed to this type of process in many fields, we always think technology, and in particular the computing industry, is immune to it. After all, our technological progress is made of several vertiginously rising paths that share the absence of a rear-view mirror. There’s no point in looking back, or even considering that some aspects of the past might recur, simply because there’s nothing to learn from them. Nonetheless, the act of uncovering patterns and potential cycles, particularly in an industry that prides itself on its continuous fresh innovation, is an extremely appealing exercise.
The pattern I’m about to describe is divided into three periods, starting at the foundation of computing history and ending with a set of strong indicators of a new, third cycle. It tries to make the case that even though individual technological components evolve at a remarkably fast and singular pace, the way in which they interrelate and behave might follow some level of cyclical recurrence. These three stages are separated by roughly two 25-year periods. The first cycle started in the late 1950s, with the spread of the mainframe-computing model, followed by the second stage at the beginning of the 1980s, with a succession of events that led to the emergence of the highly powerful laptop computer. Finally, the latest cycle has just started. Led by Cloud Computing and the Netbook phenomenon, everything seems to indicate this will be a major movement for many years to come. From an initial centralized model, to a dispersion of increasingly independent machines, the new drift foresees storage and computation draining away from many portable computers and a return to a model based on data centrality. The main distinction this time is that instead of the mainframe, the “Cloud” emerges as the central interconnected hub. Although recurring cycles might be a noticeable pattern in how data is stored and accessed, there’s still a unique common thread to all these stages: a continuous, straight progression towards mobility.
First Cycle | The Central Mainframe
Characterized by one central computer, responsible for most of the storage and processing power, linked to a series of satellite terminals, the mainframe-computing model has been a key protagonist in the history of the modern computer since the late 1950s. Back then, people accessed and interacted with immensely large mainframes through a variety of linked terminals that underwent significant changes over time. From early punch cards and teleprinters to later video displays with their familiar green and amber screens, interactive computer terminals through the 1960s and 70s had one thing in common: their lack of intelligence and total dependence on the central mainframe.
Second Cycle | The rise of the laptop and its Portability Effect
By the end of the 1970s, specialized terminals, the precursors of modern-day personal computers, were becoming smarter. Initially packed with terminal-emulation software, these machines were detaching themselves from the almighty mainframe and becoming self-sufficient entities with their own processing capability. This process opened the path for the desktop computer, with early pioneers like the Apple II and the IBM 5150 leading the way. The course of computing mobility had just started, and it would be only a matter of time before laptop computers began to materialize and eventually replace desktop computers.
For the most part, the computing industry in the past 25 years has seen laptops dramatically increase their computing power and rival traditional desktop PCs. In 1986, battery-powered portable computers had about 2% of market share worldwide. Today there are more laptops than desktops in business and general use, and in 2008, more laptops than desktops were sold in the US. Even though some mainframes have evolved into the supercomputers of the modern age, uncovering important aspects of science, like the structure of the cosmos or the vast neuronal network of the human brain, the true hero of this story is the laptop. These tiny compact boxes have become potent, full-fledged machines with the added benefit of portability – an essential attribute in an increasingly mobile world. But how long will the mobile processing-power rush last? Has it in fact reached a tipping point? Will the hero of the last decade be partially or entirely replaced by its new, weaker adversary: the Netbook?
Third Cycle | Cloud Computing: The Personal Mainframe
Cloud Computing is seen as the next computing trend and the key driver of the third cycle of data centrality. It can simply be described as a “style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet”. This model is extremely in tune with our contemporary lifestyle. We currently access the web through a variety of devices, with different features, shapes, and sizes. And while the number of access points keeps increasing, the ability to sync content between them is still an immense headache, where we rarely (if ever) see a satisfactory user experience. The Cloud paradigm substantially alleviates this problem by relying increasingly on services, information, and applications stored on online servers – the vast Cloud landscape – that can then be accessed at any time from anywhere, as long as there is an online connection.
In an enlightening special report entitled “Let it rise”, The Economist asserts that the Cloud is already a common phenomenon, with 69% of Americans connected to the web using some type of “cloud service”, including web-based e-mail or online data storage. Many companies are following this feverish movement, and in the same report Irving Wladawsky-Berger compares it to the Cambrian explosion some 500m years ago, when the rate of evolution sped up, in part because the cell had been perfected and standardised, allowing evolution to build more complex organisms.
Another indicator of this turning point is the Netbook. In part driven by a global economic downturn, the Netbook phenomenon might prove to be more than a passing craze. Characterized as a lightweight, economical, and energy-efficient laptop, especially suited to wireless communication and Internet access, this new mobile computer has been all over the news lately. A recent article in Newsweek magazine uncovered a growing market trend in Japan, where more consumers are opting for netbook computers. While PC sales in Japan went down 4 percent in the fourth quarter of 2008, sales of netbooks shot up 43 percent. The recession has been an important driver of this consumer shift, since people have become more sensitive to price, but the growth of cloud computing is its vital ingredient. There’s also an undeniable rational deduction behind this behavioral change: many users are starting to ask themselves whether they actually need all that bustling speed and storage, when their computers are mostly used for emailing and web browsing.
In the diagram shown above we can observe a series of laptops and netbooks linked to a central Cloud, which is in turn surrounded by a multiplicity of abstract devices. Many of these future devices will not require vast processing capability, since they will work as rendering windows for the same online services - flowing incessantly through all of them. The role of “windows for services” might be what awaits many future mobile computers, including mobile phones. This points to the growing value and significance of online services and applications as the vital glue across many systems and platforms.
Predictions always feel like empty promises, and there can be no certainty about what the future holds. Is the Netbook the predecessor of a future class of dumb terminals entirely dependent on the Cloud? Is Cloud Computing really going to be the next big thing? If so, how long will it last? Will it prove to be a long-lasting shift, or will people grow increasingly wary of their privacy and lack of ownership and return to a model similar to the one we have today, instigating in the process a fourth cycle of data centrality?
Posted: February 8th, 2010 | Author: Manuel Lima | Filed under: Uncategorized | No Comments »
At the end of January I was in Paris for a couple of events: a talk at ESAG - École Supérieure de Design, d’Art Graphique et d’Architecture Intérieure, and a one-week workshop at ENSAD - École Nationale Supérieure des Arts Décoratifs. The lecture at ESAG was great, and the lengthy discussion that followed, with a packed audience of inquisitive students, was extremely engaging.
The workshop at ENSAD was a longer and very fruitful engagement, part of IDN: Identité numérique mobile (Digital Mobile Identities), a new program of ENSAD Lab - a research unit for creation and innovation gathering graduate research students and professionals to collaborate on and discuss the contemporary challenges of design. Led by Remy Bourganel and Etienne Mineu, IDN is meant to investigate the flows, emerging patterns, and representations that qualify a new digital mobile identity. In this context, students at the workshop explored various ways of analyzing and visualizing social dimensions relevant to them. Some of their initial studies can be seen here.