Thoughts on the future of M&E: A wrap-up of interesting observations from SMPTE

The EMC Media and Entertainment team recently returned from the 2015 SMPTE (Society of Motion Picture & Television Engineers) conference in Los Angeles. It was four jam-packed days of technical presentations, and we all came back with a greater understanding of the future of media workflows and thoughts on how we can help our customers take those next steps.

From the agenda we expected a lot of conversations around Virtual/Augmented Reality, the transition from SDI to IP, Cloud and Hybrid Cloud, and future delivery methods. We weren’t disappointed; we had some great conversations with people throughout the industry. Here’s an overview:

Tom Burns – CTO M&E

At this year’s SMPTE Annual Technical Conference, I attended the “Broadcast Infrastructure” track, even though I really wanted to see what was happening with High Dynamic Range. (Psst – I found out that Cinematic HDR is a reality at a few AMC Prime venues! Wait for a full subjective review in a following post…)

The most exciting infrastructure trend I encountered (detailed via a number of papers and presentations at the SMPTE ATC) is the ability to replicate the real-time capability of Serial Digital Interface (SDI) video over coax within an all-IP plant. This feat is accomplished via IP encapsulation, using a group of unmodified, off-the-shelf switches in an Ethernet fabric (often in a leaf-and-spine topology).

Sending high-quality video via a point-to-point IP connection has been around for a while, at a range of price points, quality settings and codecs. The last technical hurdle, however, is to provide frame-accurate switching of an SD, HD, or UHD video signal, with embedded audio, timecode, genlock and other ancillary data.
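To give a sense of what the Ethernet fabric has to carry, here is a back-of-the-envelope sketch of the wire rate when the full SDI data stream is encapsulated into IP, in the spirit of ST 2022-6. The packet payload size and header overheads below are illustrative assumptions of mine, not values taken from the standard.

```python
# Back-of-the-envelope bandwidth estimate for SDI signals carried over IP,
# assuming ST 2022-6-style encapsulation of the full SDI data stream.
# Payload size and header overheads are illustrative, not from the standard text.

SDI_RATES_GBPS = {
    "SD-SDI (270 Mb/s)": 0.270,
    "HD-SDI (1.5G)": 1.485,
    "3G-SDI (1080p60)": 2.970,
}

PAYLOAD_BYTES = 1376             # media payload per datagram (illustrative)
HEADER_BYTES = 20 + 8 + 12 + 8   # IP + UDP + RTP + encapsulation header (approx.)

def ip_rate_gbps(sdi_gbps: float) -> float:
    """Scale the raw SDI rate by the per-packet header overhead."""
    overhead = (PAYLOAD_BYTES + HEADER_BYTES) / PAYLOAD_BYTES
    return sdi_gbps * overhead

for name, rate in SDI_RATES_GBPS.items():
    print(f"{name}: ~{ip_rate_gbps(rate):.2f} Gb/s on the wire")
```

Multiply that by the number of simultaneous feeds in a typical plant and it becomes clear why the leaf-and-spine fabric, with non-blocking switch capacity, is the topology of choice.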

The IEEE 1588 Precision Time Protocol (PTP) was presented for our enjoyment, with the addition of SMPTE extensions to become ST 2059, a protocol for genlock over IP networks. A new slate of SMPTE standards for video transport over IP networks, the ST 2022 family (parts 1–7), was also detailed.
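The core idea behind genlock over IP is simple: once every device agrees on PTP time, each one can independently calculate where the next frame boundary falls relative to a common epoch, with no analogue reference signal in sight. The sketch below is a simplified illustration of that calculation; the epoch value and the handling of drop-frame rates are my assumptions, and ST 2059-1 defines these precisely.

```python
from fractions import Fraction

# Simplified illustration of genlock over IP: every device shares PTP time,
# so each can independently compute the next video frame boundary relative
# to a common epoch. Epoch handling and drop-frame details are simplified.

def next_frame_boundary(ptp_time_s: Fraction, frame_rate: Fraction) -> Fraction:
    """Return the time (seconds since the epoch) of the next frame boundary."""
    frame_period = 1 / frame_rate
    frames_elapsed = ptp_time_s // frame_period        # whole frames since epoch
    return (frames_elapsed + 1) * frame_period

# Example: 29.97 Hz (30000/1001) video at an arbitrary PTP timestamp
now = Fraction(1_446_940_800, 1) + Fraction(123, 1000)  # epoch seconds + 123 ms
boundary = next_frame_boundary(now, Fraction(30000, 1001))
print(f"Next frame starts in {float(boundary - now) * 1000:.3f} ms")
```

Because every box runs the same arithmetic against the same clock, they all land on the same frame boundary, which is what makes frame-accurate switching in an all-IP plant possible.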

Charles Sevior – CTO APJ

Cloud
The “Cloud” is really starting to change the way Broadcast IT technology teams think about storage and compute infrastructure and the distribution of rich media content for both B2B and B2C requirements.  I spent a full day in the SMPTE TC Cloud track, consuming and questioning speakers from AWS, Sundog Media, ETC (Entertainment Technology Center) of USC, Telestream, and Levels Beyond.

There are plenty of advocates from the cloud industry pointing to global and cost-effective solutions, consuming resources on demand, and so on.  It seems certain that in a few years the dedicated racks of carefully constructed equipment and software stacks powering most media companies will be replaced with general-purpose technology, operating systems, and application stacks.

I personally think these application stacks will tend towards a hybrid between on-premises and off-premises infrastructure – with the off-premises component perhaps also a hybrid between private specialist hosting and public service providers.  That decision is primarily guided by cost and expertise – which is in turn guided by the service provider’s “value-add” in terms of high-speed connectivity to the content provider or recipient (I am thinking here of the difference between multiple uncompressed HD/UHD feeds from sports venues and delivery to a viewer’s device – broadcast or unicast).

Ultimately as we get universally adopted cloud stack frameworks, it will not be difficult to spin these up and down on different platforms in different environments.  This work is progressing well and I am pleased to see that EMC is well-positioned to deliver this technology – whether it be cloud storage, cloud computing or open-source frameworks to provide resources to third-party vendor application solutions.  Stay tuned for more announcements on this in 2016!

Anyone who brings a digital file-based workflow solution into a dynamic media organization – such as a live Newsroom – knows that the hardest problem to solve is the file naming convention.  Most facilities have their own bespoke solutions.  It doesn’t really matter, as long as it is documented and everybody follows the rules! However, as our file counts grow into the billions, and we have automatic conform and transcode processes constantly creating new files, we know that file naming remains a big problem – one that SMPTE has been working to solve and standardize.

One very cool concept that was presented (and still has my head spinning) was from Joshua Kolden, ETC@USC. He presented a solution more advanced than the UMID or an MD5 hash, which produces a unique 90-character, human- and machine-readable code for every single file on the planet.  The short summary is below, and the link to the paper – recommended reading!

“The C4 ID system provides an unambiguous, universally unique ID for any file or block of data. However, not only is the C4 ID universally unique, it is also universally consistent. This means that given identical files in two different organizations, both organizations would independently agree on the C4 ID, without the need for a central registry or any other shared information.” – ETC@USC
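To make the “universally consistent” claim concrete, here is a minimal sketch of a content-derived identifier in the spirit of C4: hash the bytes with SHA-512 and encode the digest in base 58, so the ID is a pure function of the content and any two parties computing it will agree. The exact alphabet, padding and “c4” prefix are defined by the published spec; the values below are my assumptions for illustration only.

```python
import hashlib

# Simplified illustration of why a content-derived ID like C4 is "universally
# consistent": it is a pure function of the bytes, so two organizations hashing
# identical files independently arrive at the same ID, with no central registry.
# The exact base-58 alphabet, padding and prefix are defined by the C4 spec;
# treat this as a sketch, not the spec itself.

B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def c4_style_id(data: bytes) -> str:
    digest = hashlib.sha512(data).digest()          # C4 is built on SHA-512
    n = int.from_bytes(digest, "big")
    chars = []
    while n:
        n, rem = divmod(n, 58)
        chars.append(B58_ALPHABET[rem])
    body = "".join(reversed(chars)).rjust(88, "1")  # pad to a fixed width
    return "c4" + body                              # 90 characters in total

print(c4_style_id(b"identical files -> identical IDs, no registry needed"))
```

Because SHA-512 is effectively collision-free for practical purposes, the ID doubles as both a name and an integrity check, which is exactly what a billion-file, multi-vendor workflow needs.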

OTT
Delivery of content directly to consumers “Over-The-Top” of traditional broadcast infrastructure, via the Internet, is what we all experience these days when we watch media on YouTube, Facebook, Netflix or any of the myriad other platforms.  It is of course one of the biggest consumption growth patterns that our industry is tracking, and every traditional broadcaster is actively making content available via OTT platforms.  It is both a threat and an opportunity, and a major disruption to what has been a pretty stable advertising- and subscription-funded business model that has endured over the past decades.

There were three thought-provoking sessions covering what SMPTE described as “the wild west.” Prime Focus Technologies – the India-based media platform solution provider – presented a dynamic metadata tagging solution for live sports content creation that dramatically increased the “speed to screen” from live event to mobile catch-up consumption.  Comcast spent some time delving into the real-time packaging and repurposing of linear content for OTT distribution and consumption, including Just-In-Time packaging and dynamic Ad-Insertion (server-side vs. client-side); everything has to be just right to give a good viewer experience with no buffering or pauses. The final session was a student paper from USC by Arnav Mendiratta, who was also honored by SMPTE as the 2015 recipient of the Louis F. Wolf Jr. Memorial Scholarship. He explored the application and benefits of Big Data Analytics (such as the Hadoop ecosystem) to improve viewer satisfaction and increase monetization.
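For a flavour of what server-side ad insertion involves, here is a minimal sketch of a packager splicing an ad pod into an HLS media playlist, marking the timeline breaks with EXT-X-DISCONTINUITY so the player can reset its decoder cleanly. The segment names and durations are made up for illustration; a real just-in-time packager builds playlists like this on request, per viewer.

```python
# Sketch of the server-side ad insertion idea: the packager rewrites the HLS
# media playlist on the fly, splicing ad segments in at a break point and
# marking the timeline change with EXT-X-DISCONTINUITY.
# Segment URLs and durations are made up for illustration.

CONTENT = [("seg_001.ts", 6.0), ("seg_002.ts", 6.0), ("seg_003.ts", 6.0)]
AD_BREAK = [("ad_101.ts", 5.0), ("ad_102.ts", 5.0)]

def build_playlist(content, ads, splice_after=1):
    lines = ["#EXTM3U", "#EXT-X-VERSION:3", "#EXT-X-TARGETDURATION:6"]
    def emit(segments):
        for uri, dur in segments:
            lines.append(f"#EXTINF:{dur:.3f},")
            lines.append(uri)
    emit(content[:splice_after + 1])
    lines.append("#EXT-X-DISCONTINUITY")   # timeline break into the ad pod
    emit(ads)
    lines.append("#EXT-X-DISCONTINUITY")   # and back into programme content
    emit(content[splice_after + 1:])
    lines.append("#EXT-X-ENDLIST")
    return "\n".join(lines)

print(build_playlist(CONTENT, AD_BREAK))
```

Client-side insertion keeps the playlist untouched and lets the app stitch the ads, which is simpler for the packager but far easier for ad-blockers to defeat, which is why the server-side approach got so much attention.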

VR
The pre-symposium conference track was dedicated to the emerging technology and consumer category of Virtual Reality / Augmented Reality.  You will be familiar with this as the experience of strapping on a viewing headset to enjoy a role-playing game.  Whilst currently in the realm of gaming, this may extend into movies and television as the logical progression beyond stereoscopic (3D) viewing technology.  It is extraordinary to contemplate the storage, computing, and bandwidth issues when you consider each “camera” is now a 360 degree dome with between 12 and 30 HD/UHD cameras running at a high frame rate (> 50 fps). 3D computer stitching of all cameras for every frame creates a high-resolution 360 degree “canvas.” Unicast delivery of this to every viewer – each free to choose their angle of view and direction in real time – means extremely high data rates and storage requirements.  These are challenges that will take some time to solve at a commercially viable cost.
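Some rough arithmetic shows why.  The sketch below picks one plausible rig within the ranges quoted above (24 UHD cameras, 10-bit 4:2:2, 60 fps); those particular numbers are my assumptions, but the order of magnitude is the point.

```python
# Back-of-the-envelope arithmetic for the VR capture problem described above.
# Camera count, resolution, bit depth and frame rate are illustrative choices
# within the ranges mentioned (12-30 cameras, HD/UHD, > 50 fps).

cameras = 24
width, height = 3840, 2160          # UHD per camera
bits_per_pixel = 10 * 2.0           # 10-bit, 4:2:2 sampling ≈ 20 bits/pixel
fps = 60

per_camera_gbps = width * height * bits_per_pixel * fps / 1e9
rig_gbps = per_camera_gbps * cameras
storage_tb_per_hour = rig_gbps / 8 * 3600 / 1000   # Gb/s -> GB/s -> TB/hour

print(f"Per camera:  {per_camera_gbps:6.1f} Gb/s uncompressed")
print(f"Whole rig:   {rig_gbps:6.1f} Gb/s uncompressed")
print(f"Storage:     {storage_tb_per_hour:6.0f} TB per hour of capture")
```

That works out to roughly 10 Gb/s per camera, well over 200 Gb/s for the rig, and on the order of 100 TB for every hour of capture before you even think about stitching or unicast delivery to each viewer.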

However, as my thoughts turned to what the experience would be like enjoying my favorite sports event from a virtual seat hovering close to the on-field umpire, my immediate concern was: how do I reach my beer and drink it whilst wearing a headset, without spilling a drop?  Such is the serious nature of these practical concerns. (I discovered that those on the inside are actually working on this problem by designing a “beer caddy” with the electronic visibility of a game controller.)  Maybe VR technology does have a future!

Media and Entertainment is such an important aspect of our lives, and the technology to create, deliver and archive media continues to drive towards ever more efficient workflows. The media industry keeps evolving, and these are just a few of the technologies transforming it today.

Charles Sevior

About the Author: Charles Sevior

Charles Sevior is CTO for the Unstructured Data Solutions Division of Dell Technologies. With a strong background in the media sector, he also provides focus on solutions for Automotive, AI, Semiconductors, Smart Cities and other sectors based in the Asia Pacific region. Charles has 35+ years of professional engineering experience. Prior to joining Dell he was Technology Director for leading media company Nine Entertainment Co. Australia. He has also held positions of Director on the boards of several private and public companies. Charles is working with customers to help define their next generation business and technology digital transformation – covering scale-out File and Object storage, multi-cloud and ML/DL for “useful AI business outcomes”. He has attended and presented at many industry-focused conferences and prefers a consultative approach favouring collaborative solutions with leading application partner vendors to yield excellent results for Dell customers. Charles Sevior holds a Bachelor of Engineering (Hons) degree from the University of Melbourne, and a Master of Business and Technology (MBT) from the University of NSW / AGSM in Sydney.