Live 360°

An interview with David Robinson, Executive Producer, Total Media VR (TMVR)


1. What kind of solutions do you offer?

Our business model enables us to react to specific needs and create custom solutions for different engagements. As engineers, we favour finding the best path for our clients – whether that means using available resources or developing new technology whilst integrating with our existing production footprint. We’ve also focused on creating one umbrella product platform that we can enhance, market and reliably deliver, time and time again.

We swap in and out a selection of current market camera technology, favouring systems that complement one another. This allows us to create multiple camera solutions that can be connected and controlled in a traditional broadcast environment. We need to be able to guarantee quality and reliability, so the cameras must integrate with our existing hardware, architecture and infrastructure. From there, we can offer a product that is familiar to any client’s workflow.

 

David Robinson, Total Media VR

 


2. Who is most interested in 360° / virtual reality (VR) and Live 360°?

Our clients come from music, entertainment, sports, corporate and advertising agencies. They have all shown growing interest in 360° live video, VR and augmented reality (AR). Our aim has been to fuel their imagination and provide a broad array of new solutions from our partner group to offer to their clients.

We’re also expanding into new areas through a strong relationship in the pharmaceutical sector. We’re working on developing hardware for a major brand in 2018 to focus on expanding the traditional content creation landscape in their space.


 


3. Is 2D obsolete if you capture in 3D? Is there a way to integrate 2D assets into 360° / VR?

No, far from it – 2D is not obsolete. In 2017 Grammy Award winner Christian McBride invited us to shoot the last night of his concert series in New York at the famous Village Vanguard. Working with GraysonX and HEAR360, we deployed both 2D and 360° cameras in the space, combined with HEAR360’s 8ball microphone. The process brought together a 360° background embedded with close-up angles from our 2D cameras as picture-in-picture assets. With the spatial audio from HEAR360 underpinning the visual immersion, we were able to deliver a rapid real-time deployment of an augmented experience. Joining the two mediums in a collaborative deliverable added a completely new production perspective to the value of the piece.


4. Can you get the same picture quality in 360° / VR as you do with a traditional broadcast?

It depends on the cameras, location and all the factors that affect capture. The quality of the camera technology has a bearing, together with available light and position. If you’re shooting a TV show, you’re able to paint and shade cameras to match – 360° cameras are no different. In post-production, there are tools that allow you to manage colour correction and augment quality.

When you’re working in live production, you’re trying to achieve something that is going to be instantly enjoyable and reflects a consistency without disturbing the visual experience. That’s something we are very familiar with. We have explored ways to manage consistency reliably and work with camera technology that complements the workflow.

 



5. What are the challenges and / or limitations of working in 360°?

The challenges reflect the infancy of the market: the ever-changing hardware, technology and appetite. It’s more about whether you can work with 360° capture reliably and consistently. We are there to ensure everything goes seamlessly and smoothly. The general interest and perception of 360° is all about the head-mounted view. This becomes a limitation and in some cases a detractor. It’s forcing companies like ours to push to change perception and move users to engage with content differently. Instead of locking them into their own world, we take the experience to giant interactive screens, dome projections, onto mobile devices inside a social media stream or over-captured into a traditional 16×9 frame.


6. Can you do 360° / VR broadcasts or Live 360° anywhere, or do you need a special broadcast facility?

Yes, we’ve produced broadcasts from very obscure locations. The general rule of thumb is to retain the quality expected of the production and the broadcast. When you start shooting in 360°, you’re increasing the number of cameras, therefore you’re increasing the actual frame size and resolution to broadcast. As a result, you’re looking to stream more data than you were previously. In the end, a lot of your simpler solutions start to fall away. What we’ve done is to start looking at ways we can push higher resolution via differing mechanisms and processes. In order to meet these demands, we’ve built dedicated fly-packs and portable, more manageable form factors. Anything from a full-size outside broadcast (OB) truck down to a 5K multi-capture system that can travel in the overhead compartment of a domestic or international flight. We can broadcast VR / 360° from anywhere as long as we can source the kind of bandwidth to meet the quality expectations of the live production.
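The arithmetic behind that bandwidth point can be sketched in a few lines. The resolutions, frame rate and bits-per-pixel figure below are illustrative assumptions for the sake of the estimate, not TMVR's actual production numbers.

```python
# Illustrative estimate of how 360-degree capture inflates streaming bandwidth.
# All figures are assumptions chosen for the arithmetic, not production specs.

def stream_kbps(width, height, fps, bits_per_pixel=0.1):
    """Rough rule-of-thumb bitrate: pixels per second times an assumed
    post-compression bits-per-pixel figure, in kilobits per second."""
    return width * height * fps * bits_per_pixel / 1000

hd_2d   = stream_kbps(1920, 1080, 30)   # a traditional 16x9 HD stream
mono_4k = stream_kbps(3840, 1920, 30)   # a stitched monoscopic 4K 360 frame

print(f"HD 16x9: {hd_2d:,.0f} kbps")
print(f"4K 360:  {mono_4k:,.0f} kbps")
print(f"Ratio:   {mono_4k / hd_2d:.1f}x more data to move")
```

Under these assumptions the stitched 360° frame carries roughly 3.6 times the data of the HD frame, which is why the simpler streaming solutions fall away.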


 


7. Brands want to share content via social media channels, because that’s where they frequently engage with their audience. How can you plug in and stream to popular social networks like Facebook and YouTube?

Our team members have had a long relationship with Facebook through their client, Telescope. As an engineering and production partner, Total Media VR (TMVR) support a large number of their live streaming needs. Amongst its many firsts and award-winning engagements, Telescope invented the “Donate Button” for Facebook, coding the API that Facebook uses across all of its charities and philanthropic ventures streamed online. Recently, we’ve been testing how to take a 360° live experience in 4K and stream it directly to Facebook. Working closely with the Telescope team, we successfully embedded dynamic audience interactions; moderated chat; and handled polling and event statistics direct from the Telescope environment into the 360° world of a live stream, in real-time. This example demonstrates how social media plays an important role in bringing the familiar engagements into the user’s stream and utilising the 360° environment to create a more compelling call to action with additional augmented content.


8. Can you stream or broadcast to all the social networks at once?

We can use a number of tools at our disposal to stream simultaneously to supporting social media networks. Our encoding platforms are built to support redundant streams to single accounts or multiple streams to a number of different accounts across familiar networks such as YouTube, Periscope and Facebook.


 


9. What’s the most unique thing about your offering? Any world firsts?

The most unique thing about TMVR is the way we innovate. It’s all about our eagerness to engineer and our ability to ideate. We thrive on being challenged by our clients, from the most basic request, such as hosting a small production with one or two cameras, to creating our own capture systems or customizing to meet the requirements of the creative. For example, we built a rotating frame for the director of GraysonX’s Alice Phoebe Lou project. The frame could wind up an orbiting camera mount so that as the structure moved it would set another component in motion to create a constantly spinning point of view (POV) around the central axis of the subject. In this case, the subject was the talent; as she performed, the camera turned around her, creating a unique 360° experience.

Working alongside GraysonX on SESQUI’s ‘HORIZON’ film, we were the first to create wireless 360° real-time on set confidence monitoring solutions. TMVR built a wireless product that enabled the primary 360° camera array to be viewed as a stitched view, from which the director could review and direct his shots remotely in real-time.

Partnering with Telescope, we were also the first to embed dynamic moderated chat and polling data into a 360° Facebook stream. We’re also currently working on a long-range wireless POV 360° camera for head-mounted 360° products in sports, entertainment and industrial applications.


10. Where do you see the 360° / VR / AR going in the future? Will everything be live?

I think audiences are becoming more familiar with 360° content. Using destinations such as Facebook in a more traditional setting, such as a phone or a tablet rather than a head-mounted display (HMD), increases the engagement and awareness. It also broadens the creative opportunity to include augmented development.

Currently, we will focus our attention on live streaming and broadcast. There are opportunities – particularly in social media – where live interaction goes further than just watching. Actual engagement in real-time delivers an increasingly more compelling user environment. For instance, simple movement in the space to engage in different aspects of a news feed creates a new opportunity to ideate. Personally, I think combining AR and 360° together and delivering it live from a social platform with audience engagement, creates the most compelling direction for the medium. TMVR is aiming to embolden this step and push to open the user experience to engage a broader interest and a more expansive call to action.


 

The Importance of Spatial Audio

An interview with HEAR360’s Founders: Matt Marrin and Greg Morgenstein


1. How did you go from being engineers and producers to launching HEAR360 and creating audio technology?

Matt: Greg and I have been working in the music business for over 20 years. I started working in recording studios in Los Angeles and moved up through a traditional system. I started as a runner, taking food and drink orders, and then moved up the ranks as an assistant to the engineers in recording sessions. If you’re fortunate enough, you start working with those engineers and start producing and engineering records on your own. I met Jimmy Jam and Terry Lewis and started working with them exclusively, recording a lot of their projects – Janet Jackson among many others.

After you are in this world for a long time, people start to recognize that you have a lot of experience with sound and audio and a number of opportunities come up. That’s what happened for Greg as well. New start-ups and companies in the audio technology world are looking for experienced audio professionals that can evaluate, critique and help develop their technology.

Around 2007, Greg and I started consulting for a spatial audio company that was developing software for headphones and soundbars. That led us to testing and tuning prototypes like sound systems for cars with spatial audio. I consulted for SONOS evaluating the sound of their products when they were in early and late-stage development, and later started consulting for DTS. They were looking to expand their spatial headphone system and needed people with mixing experience to put together demos, and to evaluate and make observations on improving those systems.

Through the path of consulting and talking to companies about how to improve their products, we started coming up with our own product concepts and realized we could do something on our own. We launched HEAR360 and decided to make those concepts a reality.


2. Briefly, why is spatial audio important and how does it work?

Matt: There are different ways to experience immersive content – in a virtual reality (VR) scenario where you’re in a head-mounted display (HMD), watching something on a mobile device in 360°, or watching on your laptop in 360°. When you’re watching an immersive video you can look around, experience different aspects of it and control your experience. If the audio doesn’t have a head-tracking component then you’re not really getting a fully immersive experience. You’re only getting 50% of it. In these scenarios, in order to sell the consumer on the experience, you have to deliver 100%, otherwise they won’t buy it.

People often overlook sound and consider it last. Most people pay attention to the visuals and think it’s the most important part. We think video is only half of the experience. Even though a user may not be able to articulate what immersive sound is or how important it is, they instantly recognize when they hear something that makes them feel like they are really inside an environment.

Greg: We provide those components that complete the whole sensory connection between the visual and auditory. When that is completed the experience becomes believable, real, and engaging.

Spatial audio is achieved by replicating the way our ears hear sound. Imagine someone trying to record a stereoscopic static video. They would set up two camera lenses in close proximity to our eyes so that they can see something through the lenses the same way that we do. We take that same approach to recording sound. It’s really important to capture sound with a pair of microphones from a perspective that replicates how our ears hear that sound.

Matt Marrin, Greg Morgenstein and Saul Laufer, Hear360

3. What are the other ways of replicating 3D sound?

Greg: We set out to create something that captures audio with a very natural sounding result. The outcome is that you feel like you’re really there. Our current approach represents how the human ears work together in capturing scenarios. Another way of capturing spatial audio or creating a spatial audio experience is the synthesis of it. This is where you capture a sound signal and create an experience in post based on that sound field, similar to ambisonic capture or ambisonic deliverables. It’s another method that a lot of companies in this industry have adopted. We’ve created a lot of those tools and built a ton of prototypes of those tools along the way, but we think that our first products should be really natural sounding, so we decided to start by pushing the binaural experience. We believe binaural has a very important place in VR and immersive content.


4. Explain the difference between binaural and ambisonic.

Matt: We can separate spatial audio capture into two main categories: binaural capture and delivery, versus ambisonic capture and delivery. One difference between the two formats is that ambisonic recordings require synthesis to be spatialized – the audio needs to be converted to B-Format and then fed through head-related transfer function (HRTF) filters in order for a listener to hear the spatial component. The problem here is that you can’t control how a platform’s HRTF filters sound – meaning that you lose control of how your recordings sound at the delivery stage. There are other problems with ambisonic recording, including phase and frequency response issues that are inherent in the decode process. The reason why we love our omni-binaural recording solution is because it is not synthesized. All the spatial audio cues are baked into the original recording in high resolution.

Coming from a background of over 20 years of recording, mixing, playing, and producing music, as an audio engineer you’re drawn to capturing something, hearing it the way you captured it and understanding how it sounds. We are obsessed with resolution and how we record things. We’re fascinated with our playback monitors, and when we mix we listen in all sorts of different environments including our cars, homes, laptops, etc. We do that for one reason – because we want to make sure our mix sounds great wherever it’s played and we want to have a firm understanding of that.
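For readers unfamiliar with B-Format, a minimal sketch of first-order ambisonic encoding may help: a mono source is panned into the four W/X/Y/Z channels using the standard trigonometric equations (FuMa convention shown). This illustrates the general format Matt refers to, not HEAR360's own processing.

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg=0.0):
    """Encode one mono sample into first-order B-Format (FuMa W/X/Y/Z).

    W carries the omnidirectional component (scaled by 1/sqrt(2) in the
    FuMa convention); X, Y and Z carry the figure-of-eight directional
    components along the front, left and up axes respectively.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample * (1.0 / math.sqrt(2.0))
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)
    return w, x, y, z

# A source dead ahead contributes fully to X and nothing to Y or Z:
w, x, y, z = encode_first_order(1.0, azimuth_deg=0.0)
```

Spatializing this for headphones then requires decoding through a platform's HRTF filters, which is exactly the stage where, as Matt notes, the recordist loses control of the final sound.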


5. What was the motivation or inspiration for developing a spatial microphone? Did it have to do with creative challenges or developments in 360° or VR?

Greg: We wanted to have an end-to-end solution where we control the quality of it from capture to delivery. We felt like there was a place for an omni-binaural capture because we wanted to have something that sounded very natural and sounded like the space and the performance that was played in the initial environment. We liked that you could capture sound that way, without the need to change it or affect it if you didn’t want to. So we sought to develop a microphone that could accomplish that. From there, we had to create all of the tools to edit the captured content, mix it with non-spatial audio, or things that you wanted to become spatial, and finally deliver it encoded for your preferred content platform.

Matt: Before the 360° video and VR world evolved, another big motivation for us was how cool binaural audio is. We had been working in a space where static binaural was something that had been around for years, but not a lot of people had experienced it. When you hear it, it really changes the way you think about experiencing recorded sound. When you realize you can experience recorded sound in three dimensions, it’s kind of mind-blowing. Now there’s an opportunity where you can interact and move around in a sound environment, making the experience even more believable. When VR and 360° video came into play, immersive audio was a neglected area. At the time, it was more of an idea than anything. For us, it was all about the excitement of delivering something new and sharing that excitement with other people.

Vic Mensa, AT&T and Direct TV

6. Technically, how does the 8ball microphone work when capturing sound? Is it easy to use when it comes to production? Who should buy the 8ball microphone — who is your ideal customer?

Greg: The technical aspect of how it works is that it basically mimics four human heads facing in four opposing directions – front, left, rear and right. In every human head, we have two ears, pinnae and acoustical shadowing based on the curvature of the head. Those are the basic aspects of a capture. Because your ears are separated, you have a time delay between their captures, and when you introduce a pinna you have filtering or tone colourization based on directionality towards a pinna on the front or the rear. When you combine all these things – including the curvature of the sphere, acoustic shadowing, time delay and level differences based on the directionality of the oncoming signal and omni-directional capture – you get a spatial capture. When you do it in every direction you can head-track that capture. Since it’s based on the human head in a standard binaural capture, you get a very natural sound.
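The time delay Greg describes can be approximated with the classic Woodworth spherical-head model. The head radius and formula below are the textbook approximation, not 8ball's internal geometry, and are shown purely to put numbers on the cue.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
HEAD_RADIUS = 0.0875     # m, a commonly used average head radius

def itd_seconds(azimuth_deg):
    """Woodworth approximation of the interaural time difference (ITD):
    ITD = (r / c) * (theta + sin(theta)), with theta the source azimuth
    measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly ahead produces no delay between the ears; a source at
# 90 degrees (straight out from one ear) produces the maximum, ~0.66 ms.
print(f"0 deg:  {itd_seconds(0) * 1e6:.0f} microseconds")
print(f"90 deg: {itd_seconds(90) * 1e6:.0f} microseconds")
```

Delays on this sub-millisecond scale, combined with the level and tonal differences from the pinnae and head shadowing, are the cues the brain reads as direction.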

When it comes to production, we designed 8ball to be point and shoot. There are eight channels that you’re capturing. You’ll need a multi-channel audio recorder – Sound Devices, Zoom or any other hardware device that records eight channels of audio will work – and then you just point and shoot. We’ve designed all of our post tools to make it easy to manipulate the capture if you’ve captured something wrong. In that case, you can recalibrate with our calibration tools. If you don’t want to do anything in post, you can just multiplex (MUX) it with video with our encode tool and deliver to whatever platform you see fit. You can also convert it to an ambisonic deliverable for Facebook and YouTube.

Ideally, the people who purchase 8ball use it for a lot of music related capturing, reporting, documentary, cinematic, or live streaming because of how natural it sounds. It’s simple to use and streamlined, so it doesn’t require a lot of encoding or manipulation. Another good example is capturing natural background environments such as a park, jungle or street for VR gaming experiences and 360° experiences. It’s eerie how real it sounds and it’s great to put that into a virtual experience.

HEAR360 8ball

7. Do you need special software to decode captured audio from an 8ball microphone? Do your tools integrate with traditional audio post-production tools?

Greg: Yes, it’s all very easy to use. You don’t have to learn anything new to mix and integrate with traditional audio post-production tools. We built our workflow into all the standard digital audio workstation (DAW) systems. The first one we supported was Pro Tools because that’s basically the industry standard for mixing and editing. A lot of people use Reaper because it’s very flexible, so we created virtual studio technology (VST) plugins. These tools can be utilized in other DAW systems as well. We are working on creating some limited tools for video editors like Premiere and Final Cut Pro, where they can take an 8ball capture and put it into these systems, calibrate to capture, edit, encode, and deliver it.


8. What about playback of spatial audio, why not focus on this as well? Wouldn’t the ultimate experience of immersive sound be through a high-end amplifier or expensive speakers? What is the best way to hear spatial audio?

Greg: Most people experience VR content in a head-mounted display (HMD) and every HMD requires a headphone component to it. So that’s the first thing we wanted to support. When you’re wearing headphones, you’re separating your left ear from your right ear. During a spatial audio playback, it’s easier to separate them because you don’t have crosstalk. For playback over speakers, we’ve created tools that include crosstalk cancellation – so when you play information from the left speaker your right ear won’t pick it up, and vice versa. You can steer that information and work with delay to manipulate things in order to mimic a headphone experience and create a compelling spatial audio experience. It’s something that we’ve been working on since we started the company, but we don’t yet see a large use of it in the industry. Most people are wearing headphones, so we are focusing on headphones first. However, we have created tools for beamforming, crosstalk cancellation, and tools to deploy a spatial audio experience over speakers for future products.
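In its simplest, frequency-independent form, crosstalk cancellation amounts to inverting the 2x2 matrix of speaker-to-ear paths. The toy sketch below uses made-up scalar gains to show the principle; real systems, like the ones Greg describes, work per frequency band and include the inter-path delays.

```python
# Toy crosstalk canceller: invert the 2x2 matrix of speaker-to-ear gains so
# each channel arrives only at the intended ear. The gain values here are
# made-up illustrative numbers, not measured acoustic responses.

g_direct, g_cross = 1.0, 0.4   # ipsilateral (direct) and contralateral gains

# Acoustic path matrix H = [[direct, cross], [cross, direct]]; the canceller
# is C = H^-1, so the cascade H @ C is the identity and crosstalk vanishes.
det = g_direct * g_direct - g_cross * g_cross
C = [[ g_direct / det, -g_cross / det],
     [-g_cross / det,  g_direct / det]]

def at_ears(left_in, right_in):
    """Apply the canceller, then the acoustic paths; returns (left_ear, right_ear)."""
    sl = C[0][0] * left_in + C[0][1] * right_in   # left speaker feed
    sr = C[1][0] * left_in + C[1][1] * right_in   # right speaker feed
    return (g_direct * sl + g_cross * sr,          # what the left ear hears
            g_cross * sl + g_direct * sr)          # what the right ear hears

# A left-only input now reaches only the left ear:
print(at_ears(1.0, 0.0))
```

With the crosstalk removed, the binaural cues in the left and right channels survive over speakers much as they would on headphones, which is what makes the speaker experience Greg describes possible.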

The 8ball in action: capturing 360° audio at the Madrid Open

9. Can you live-stream spatial audio?

Matt: Yes, but only if you build very specific tools, which we have. In order to do this, we built our own streaming server to live-stream multi-channel audio. We set it up so that our web player could render live 4K video with head-trackable spatial audio over LTE.

There are a few people who are attempting to live-stream with spatial audio, however, it’s a pretty difficult process to pull off. It’s not as simple as what we’ve created – our solution is more point and shoot. Our solution also gives you the ability to scale up into an advanced live broadcast and the ability to MUX non-spatial content with spatial captures. We’ve created all the tools to allow engineers to do this in real-time, with 4K video and head-tracking over our servers.


10. What creative possibilities do you see in the future of spatial audio?

Matt: A wider range of flexibility will happen over mobile and the delivery format will open up so you can experience VR in your home, or watch it on television more easily with fully interactive sound. Also, live experiences with spatial audio like car racing, horse racing, and other sporting events will become more readily available and expected by consumers who will demand increasing levels of interactivity with the content they consume.

Spatial audio is very important for live events as it completes the transformation of the user feeling like they are truly inside the experience. The difference between headphones and speakers will start to blur and you’ll be able to experience spatial qualities on any playback system. Augmented Reality (AR), increased computing power, personalized HRTFs, and object-based spatial audio solutions will allow us to be truly interactive with sound through normal everyday activities.


 

The Future of Sound

An interview with Dave Sorbara, Partner and Chief Creative Officer, Grayson Matthews

 


1. Over the past 17 years, Grayson Matthews has specialized in music and sound for film, television and radio. Recently you started exploring new mediums, primarily around the creation of immersive content. What sparked your interest in immersive media and spatial audio?

Back in the late 1990s, we were in recording studios with giant consoles and tape machines that were very expensive – it was the only place where you could do professional recordings. Then the computer came along, and I remember being really fascinated with what you can do with a computer versus an analog console. I was interested in the ability of computers and technology to do a lot of things that analog technology in recording studios had been doing. That was a thread in our existence that always led us to the next transition in terms of the evolution of our company; it was really about technology.

As we progressed, I discovered spatial audio and virtual reality (VR) and what was happening behind the scenes with a lot of tech-heavy audio people, who are really more mathematicians than audio people. It’s a whole new way to think about sound and requires a new set of technologies. Suddenly, the possibilities of what you can do with sound began to open up again and I started getting the same feelings in my bones about sound. It was all about solving new problems and hearing things you never heard before. I felt reinvigorated in thinking that this was another wave of interest that can drive us forward over the next decade or so.


2. How is immersive content changing the way we tell stories and experience audio?

Immersive content is all about elevating the engagement with the individual audience through interactive stories. It’s that idea that you want to pull the audience into the story more so than having them sit there and watch the narrative. For example, my four-year-old son loves watching videos on YouTube, however his biggest obsession isn’t watching the video itself, it’s being able to change very quickly from video to video – he won’t watch the whole thing. At one point, I showed him a 360° video and once he realized you can pan the camera around and engage with a piece of content, static video in comparison was boring.

For younger people who have grown up in the age of the iPhone and the Internet, being engaged with content more so than being a passive player is really important. A friend of mine’s son, who is 12 years old and loves hockey, would never sit and watch hockey on television. He would rather play hockey in a hundred different forms such as hockey video games, street hockey, fantasy league hockey, etc. You can engage with the love of that sport, instead of passively watching a Maple Leafs game on TV.


3. What are some of the best immersive experiences you can think of to date? Who’s doing this well?

I think the best example of immersive content today is mixed reality experiences where you’re combining physical objects in a virtual free roam space. Toronto companies such as The Void, Globacore and one of our partners Seed Interactive do this very well. It is remarkable how you can transport your mind and body to a different planet via these different technologies. To feel like you’ve been transported instantly and have your brain fully believe it, is pretty amazing.

Mixed reality (MR) experiences are definitely cutting-edge – there’s not a lot of people doing them. We are in very early days, but there are some great VR games out there. One that I love playing is SUPERHOT, which is an amazing example of such a simple concept that only works inside of an immersive experience. Another huge one for me is music, because it is completely different in the context of immersive content. What I believe is the most important part about music these days is music performance. As computers advance in music production, live music will be crucial to its longevity and popularity. Capturing a music performance using 3D capture technology for audio and video and making someone feel like they are actually there is one of the best immersive experiences. There’s such a different emotional connection you make to the music in a performance inside of a headset or in a connected immersive piece of content.


4. What VR/AR gear would you recommend someone who is new to the immersive world?

At this moment in time, I would hesitate to buy anything VR related that connects to a large computer. Vive’s next iteration will be coming out this year and a lot of the gear is moving away from being tethered to computers. The next generation consists of stand-alone, no computer required, high frame-rate and high-resolution headsets that will be able to deliver what we’re delivering today on an Oculus connected to a powerful computer. The difference is greater portability and ease of use, with less setup required.

In terms of augmented reality (AR), if you haven’t experienced what you’re capable of doing now with your AR tool kit on the iPhone, that’s the first place to dig into because there’s so much showing up there. Check out the IKEA Place AR app and you’ll get a feel for where this medium can go. It’s a great way to get people familiar with the ideas of what the possibilities are. Also, AR headsets will start making their way into the marketplace. Microsoft is coming out with a new iteration of their HoloLens that’ll be high-resolution. And the super-secretive company Magic Leap has finally shown off their first MR headset.


5. Tell us about your experience designing 360° soundscapes for SESQUI’s spatial dome.

I have to say that designing 360° soundscapes is interesting because you take all the learning you’ve done for stereo and surround and you think you can apply it, but then you realize it’s totally irrelevant and you have to start again from the beginning. The amazing thing about the process of starting fresh is going back to that creative instinct of just using your ears, your feelings and your heart, asking yourself, “Does this sound cool? Is this interesting?” Surprisingly, you have this infinite number of creative possibilities in a 360° space. You’re creating environments and coming up with ideas on the spot and you want to experiment with all of it because you have no idea whether they’re going to work or whether it’s right or wrong. What ends up happening is you go back to that instinctive child in you who’s just playing to find something. In a way, that’s what a lot of the SESQUI project was really about.

The most eye-opening thing about creating sound for these types of experiences is that you have to throw convention out of the window and start from scratch.

Starting from scratch is a beautiful thing because it’s freeing and scary. You thought you knew something and suddenly, now you don’t. It was an amazing experience for that.


6. What can we expect to see on Grayson’s mobile application?

What we’re trying to do is push the possibilities of music and sound. This is an application where we can show people where it can go. We’re limited a lot of the time by the mediums with which we’re sharing content, so we’ve developed an app that will demonstrate the most cutting-edge version of what we’re trying to do. We aim to find a natural place where sound can go that resonates the way music does emotionally.


7. What are some challenges you have faced in building and creating immersive music experiences artificially?

The biggest challenge is creating the feeling of what we’re trying to achieve for the end users. It’s a technical challenge because a lot of the 3D spatial audio that we hear today is a synthesized mathematical equation and it makes sense and it works, but it’s missing that natural sound that connects with the individual. Although it localizes sound, it doesn’t sound real.

For example, HEAR360’s 8ball microphone tries to capture that feeling versus an ambisonic microphone which captures a mathematical version of what it’s hearing. I think the next frontier is getting people to emotionally connect to sound like they do to a piece of music. We’ve had 75 years of working with technology and understanding it in order to get to a place to make people feel something when they hear it.


8. Why is it crucial to run tests before integrating 360° audio into live-streams?

The tests are important because there’s no precedent for what you’re going to hear and what it’s going to sound like. Since everything is new, almost every project is an R&D experiment where you have to solve more problems. The technology we’re using is new and always changing, so every time you deploy new gear you have to understand what you’re going to get out of it. Testing and prototyping are a huge part of the process because there are so many variables. You need to do the R&D and testing to ensure you are going to get something usable when you are shooting.


9. What are your greatest findings in experimenting with immersive technologies?

The scary thing that I’ve learned is that immersive technologies can transport human beings pretty easily. Realising how easy it is to take someone’s brain and convince them that they are doing something else, are somewhere else or feeling something, is pretty powerful. It has the ability to educate exponentially quicker than other methods, and it can make things feel more real than anything else. Recognizing how powerful that is, you know it’s here to stay; it just needs to be taken seriously.


10. Where do you think it’s all going?

We live in this experience economy, where experiences are the most important things to humans. The ability for new technologies to make people experience things they typically wouldn’t have access to, or have not felt before, is becoming more evident. Education is going to change drastically. For example, you can take a nine-month technical training course and shrink it down to two weeks because of technology like this. Entertainment is going to play a large role in the immersive world. In the near future, you’ll be able to sit courtside and watch a basketball game through cameras with six degrees of freedom, which will give you the ability to walk and move through the space. AR has given us the ability to overlay information and content and integrate it into real space. For example, AR tape measures, which can be downloaded in the App Store, are a game changer. Businesses that mass-produce tape measures are now competing with AR technology because an app gives everyone the convenience of virtually carrying around a tape measure in their pocket. The AR tape measure apps are more powerful, more efficient and have more features than a physical tape measure. These types of advancements are constantly going to happen in the near future.


 

8ball: Sound Incredible

8ball, a world-class spatial microphone, is now available.


The 8ball is a patent-pending omni-binaural microphone that comes with companion software to provide a seamless workflow from on-set recording to content delivery. 8ball delivers truly immersive spatial audio, and sounds incredible.

It requires 48 V phantom power and works with any 8-channel field recorder. The microphone comes with a premium 6’ Mogami multi-channel snake that terminates in micro-XLR on the microphone end and standard male XLR output connectors.

The 8ball and its accompanying core-suite software workflow have been designed and developed through an extensive beta programme with VR filmmakers and content producers from across the audio and film communities. Applications for this microphone continue to present themselves, yielding excellent results across mediums and disciplines.


+ Easy To Use

The 8ball has an easy-release, self-centering mount: a simple clamp fastens the 8ball to the camera stand and releases quickly, making set-up a snap. It fits a wide range of tripod sizes, mounts out of the camera’s view, and centers on the camera’s point of view to accurately capture audio.


+ Seamless

The 8ball companion software is a full end-to-end workflow solution that allows you to deliver to multiple channels. Recordings from the 8ball integrate seamlessly into common post-production environments via AAX, VST, AU and Unity plugins. The 8ball produces non-synthesized spatial sound in a format that works natively with Samsung Gear VR, Unity, and the HEAR360 iOS app and web player. With HEAR360’s conversion tools you can easily adapt mixes for publishing to YouTube, Facebook, or any other FOA platform.


+ Holographic

With the 8ball, you can capture the world in a new way. Omni-binaural recordings capture the nuance and detail of our three-dimensional world, providing content creators with an uncompromising sound field to sculpt from. Combined with HEAR360’s core-suite of software and head-tracking, we’ve created an opportunity for creators to develop an immersive audio experience that is nearly indistinguishable from reality.
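Head-tracked playback of omni-binaural recordings is commonly achieved by crossfading between the binaural pairs closest to the listener’s current head yaw. The sketch below assumes four binaural pairs spaced at 90° intervals with an equal-power crossfade; this layout and blending scheme are illustrative assumptions, not the internals of HEAR360’s core-suite.

```python
import math

# Hypothetical layout: four binaural pairs facing 0°, 90°, 180°, 270°.
NUM_PAIRS = 4
SECTOR = 360.0 / NUM_PAIRS  # 90° between adjacent pairs

def pair_gains(yaw_deg):
    """Equal-power crossfade gains for each binaural pair given head yaw.

    The two pairs bracketing the listener's yaw receive cos/sin weights,
    so the mix sums to constant power; all other pairs are silent.
    """
    yaw = yaw_deg % 360.0
    sector = int(yaw // SECTOR)            # index of the pair behind us
    frac = (yaw - sector * SECTOR) / SECTOR  # position within sector, 0..1
    gains = [0.0] * NUM_PAIRS
    gains[sector] = math.cos(frac * math.pi / 2)
    gains[(sector + 1) % NUM_PAIRS] = math.sin(frac * math.pi / 2)
    return gains

# Facing 45°: an equal blend of the 0° and 90° pairs.
g = pair_gains(45.0)
```

Because each pair is a real binaural capture rather than a synthesized render, rotating the head simply re-weights recordings that already contain natural interaural cues, which is one way a capture-based approach can stay closer to reality than a purely mathematical decode.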


+ Demos

We’d be happy to give you a demo and show you what the 8ball can do. It is a world-class microphone and the sound it delivers is incredible, on any type of production. In the meantime, pop on a good pair of headphones and have a listen:

  • Hear what it’s like to be in the middle of the audience at a big music festival. This is Major Lazer with 360 spatial audio recorded with the 8ball, a lightweight field recording solution with exceptional results – no board mix, additional sweetening or effects added. (Click to play)
  • Listen to country star Jason Aldean perform at the iHeartCountry Festival in Austin, Texas. This is a great example of a stadium mix being merged with spatial audio captured with the 8ball. (Click to play)

 


Already convinced and want to pre-order? Go here and use the discount code GRAYSONX to get an additional $50 off the price.