Rana June, Technologist

Rana June, aka the iPad DJ, has performed at The White House Correspondents’ Dinner, Apple’s WWDC, TED, and The X Prize Foundation. Her experience performing at 200 different venues, totaling over 100,000 attendees, led Ms. June to explore the use of real-time audience analytics. Her company, Lightwave, is pioneering a biometric platform built on wearable technology at concerts.

MBJ: How did you become involved with technology?

I was fortunate because I grew up in the D.C. area, where there was both a vibrant music scene and a strong technology sector. From an early age I held both artists and technologists in high regard. Right around the time I started playing music I bought my first PowerBook, so I was learning music while using a digital audio workstation on the laptop. That integrated experience was never lost on me, and production has always been part of my music-making process.

My job in college was at Guitar Center, which gave me the opportunity to be around musical instruments all the time. Every free moment I had I would spend trying out new digital tools, like Fruity Loops, Logic, and plugins of various kinds, a perk of the job. I had a studio in my house with instruments and computers, but it was strictly after-hours work. Now it is coming up on fifteen years! My interest in electronic dance music also helped: EDM has always been a popular art form amongst coders, and technology has greatly influenced it.

MBJ: How did the iPad come into it?

It had to do with me being a guitarist. I could plug in a wireless guitar bug and be mobile anywhere on stage. My understanding of the audience, of how to interact and put on a great performance, depended on being unchained. When I first started focusing on electronic dance music, I found it very jarring that the creative potential of the computer was so physically restricted. It just seemed counterintuitive that a DJ had to be stuck behind a desk.

When the iPad came out in 2010, the seed in my mind had already been planted. I had pre-ordered Apple’s first iPad and was standing in line at the Apple Store in New York City. The day before the launch Apple had released the iPad App Store, and I used my iPhone to check it. I quickly realized there was an entire section of the store just for music, and scanning the few apps that were available I understood, like a bolt of lightning, that this was going to change music production and performance forever. The area was ripe for disruption. I went on to purchase two iPads and combined them with a basic Numark mixer, which I really hacked together, to start my own way of DJing.

At the time I was part of the iOS development community and familiar with the events happening in that space. A conference, iPad Dev Camp, was scheduled to meet in San Francisco, and I had already told the organizer about an idea I was working on. The event was closed and intimate, with around 300 people in attendance, so I was invited to debut my playing concept. I performed a fully improvised thirty-minute set using existing apps. The event took off. Somebody in the audience sent a video to another technologist, who interviewed me and published the video under the heading The iPad DJ; it quickly got up to 200K views. Then things changed suddenly the next day, when Gizmodo, a blog site, leaked that a prototype iPhone had been left in a bar, upsetting Steve Jobs. As traffic to that site grew, I got in touch with the editor, a friend, who had seen my performance video on Twitter. He posted a picture and link on the main page. The video quickly went from 200K to 1M views, unheard of in 2010. From that point forward I started touring. In two years I played over two hundred shows using only tablet computers. Over time the rig became very sophisticated, using wireless technology and over six iPads to create a completely untethered experience as a performer. I think that is what captured musicians’ imagination. I didn’t have millions of dollars in production, just iPads.

MBJ: What is your vision for live music?

We live in a world that is data driven. Media, corporations, and individuals consume information, and data analytics are everywhere. One of the few areas still unexplored is the capture of biometric signals from audiences at public events. I invented Lightwave’s technology because I wanted to understand audience reaction there and then, at the concert, not afterwards from Twitter.

The platform depends on fans using wearable technology to track their emotions live (see MBJ, ‘Your Heart on Your Sleeve’, May 2015). I wanted to know if a song was landing with the audience, or if the excitement was peaking too early before a later climactic moment, so that the artist could make an adjustment in time. This information can shape not just the way we interact with entertainment audiences, and the business of music in particular, but also help other ventures that can benefit from knowing the emotional feedback of their consumers there and then. Direct-to-consumer industries can benefit from research on this type of data, as the film industry already does when it runs focus groups to adjust scenes based on the immediate response of captive audiences before releasing what are usually big-dollar productions. Getting Lightwave into the hands of artists is something we are extremely excited about because of its potential to change live music performance.
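To make the idea concrete, here is a minimal sketch of how real-time audience feedback from wearables might be aggregated. This is a hypothetical illustration, not Lightwave’s actual system or API: it assumes each time slice yields a set of heart-rate readings from the crowd, and it summarizes recent slices into a single “excitement” ratio against an assumed resting baseline.

```python
from collections import deque
from statistics import mean

# Hypothetical sketch (not Lightwave's real implementation): estimate crowd
# excitement from wearable heart-rate samples using a sliding window.

class CrowdPulse:
    def __init__(self, window_size=30, baseline_bpm=80):
        # Most recent per-slice crowd averages; old slices fall off automatically.
        self.window = deque(maxlen=window_size)
        self.baseline = baseline_bpm  # assumed resting heart rate in BPM

    def ingest(self, bpm_samples):
        """Record the average heart rate across all wearables for one time slice."""
        self.window.append(mean(bpm_samples))

    def excitement(self):
        """Return recent average heart rate as a ratio of the baseline (>1.0 = elevated)."""
        if not self.window:
            return 0.0
        return mean(self.window) / self.baseline

monitor = CrowdPulse()
monitor.ingest([92, 105, 88, 110])    # one slice of audience readings
monitor.ingest([120, 115, 130, 125])  # heart rates climbing during a drop
print(monitor.excitement())           # ratio above 1.0 suggests an elevated crowd
```

In a real deployment the readings would stream in over wireless links and the artist (or a visual dashboard) would watch this kind of signal evolve song by song, which is the “there and then” feedback described above.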

 

By William Kiendl and John Lahr

