What are the main screen projection protocols for multi-screen interaction?

Multi-screen interaction refers to sharing display content across devices with different operating systems and hardware (mobile phones, tablets, TVs, etc.) that are connected over a network. After a series of protocol handshakes and negotiations, multimedia content (audio, video, images) is transmitted, parsed, displayed, and controlled between the devices. Simply put, the screens of several devices can be linked to one another, sometimes through dedicated hardware: a movie on a phone or a photo on a tablet can be played on a TV, and a computer's screen can be shared to a TV.

The main multi-screen interaction protocols today include Apple’s AirPlay, DLNA, Chromecast, Miracast, WiDi, etc. An overview is given here, and each protocol is described in more detail below.

1). AirPlay:

AirPlay is a proprietary protocol developed by Apple, consisting mainly of AirPlay music, video, and photo streaming plus AirPlay mirroring. Because the protocol handshake involves encryption, decryption, and authentication, third-party receivers must either crack the protocol or purchase a license from Apple (AirPlay 2 has been opened to TV manufacturers such as Samsung, LG, and Sony).
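
One part of AirPlay that is openly observable is discovery: receivers announce themselves over Bonjour/mDNS as the "_airplay._tcp" service. The sketch below only lists receivers on the local network; it assumes the third-party python-zeroconf package is installed and does not attempt the encrypted pairing handshake mentioned above.

```python
import time
from zeroconf import ServiceBrowser, Zeroconf


class AirPlayListener:
    """Prints AirPlay receivers announced over Bonjour/mDNS."""

    def add_service(self, zc, service_type, name):
        info = zc.get_service_info(service_type, name)
        if info:
            # parsed_addresses() returns the receiver's IP addresses as strings
            print(f"Found AirPlay receiver: {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc, service_type, name):
        print(f"AirPlay receiver went away: {name}")

    def update_service(self, zc, service_type, name):
        pass  # callback required by ServiceBrowser; nothing to do in this sketch


zc = Zeroconf()
browser = ServiceBrowser(zc, "_airplay._tcp.local.", AirPlayListener())
try:
    time.sleep(5)  # listen for announcements for a few seconds
finally:
    zc.close()
```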

2). DLNA:

DLNA was established on June 24, 2003, and was formerly known as the DHWG (Digital Home Working Group), initiated by Sony, Intel, Microsoft, and others. The organization aimed to interconnect wired and wireless networks, spanning personal computers, consumer electronics, and mobile devices, so that digital media and content services could be shared without restriction. The organization was officially dissolved on January 5, 2017.
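
Although the organization has been dissolved, DLNA devices are built on UPnP, so media renderers can still be discovered with a plain SSDP M-SEARCH query. The standard-library sketch below illustrates that discovery step; the search target (ST) value is the usual MediaRenderer device type and is an assumption about what the devices on a given network expose.

```python
import socket

# SSDP M-SEARCH request, multicast to the well-known UPnP address/port.
MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: urn:schemas-upnp-org:device:MediaRenderer:1\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.settimeout(3)
sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(8192)
        # Each response contains a LOCATION header pointing at the device description XML.
        print(f"{addr[0]} -> {data.decode(errors='replace').splitlines()[0]}")
except socket.timeout:
    pass  # no more responses within the timeout window
finally:
    sock.close()
```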

3). Chromecast:

Chromecast is a multi-screen interaction protocol developed by Google (a proprietary protocol, available in first- and second-generation versions). It wirelessly sends content from small-screen devices (mobile phones, tablets, laptops) to large-screen devices (Google TV, Chromecast dongles, etc.) for playback, supporting both music/video push and screen mirroring.
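
As an illustration of the push model, the sketch below uses the third-party pychromecast library (its API has changed between versions, so treat this as an approximation) to discover a Cast receiver and hand it a media URL; the URL is hypothetical, and the receiver fetches and plays the stream on its own rather than having it mirrored from the sender.

```python
import pychromecast

# Discover Cast receivers on the local network (mDNS under the hood).
chromecasts, browser = pychromecast.get_chromecasts()
if chromecasts:
    cast = chromecasts[0]
    cast.wait()  # block until the connection to the device is ready

    # Push a media URL to the receiver; playback happens on the device itself.
    mc = cast.media_controller
    mc.play_media("http://example.com/sample.mp4", "video/mp4")  # hypothetical URL
    mc.block_until_active()
    print(mc.status)

pychromecast.discovery.stop_discovery(browser)
```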

4). Miracast:

Miracast is a wireless multi-screen interaction standard released by the Wi-Fi Alliance in 2012, built on 802.11 Wi-Fi Direct, and is used to share audio and video content between devices. The biggest advantage of the Wi-Fi Direct connection is that no router is needed during media sharing: the two Wi-Fi devices connect to each other directly, one acting in the AP (group owner) role and the other in the station (client) role.

The protocol handshake is based on RTSP, and the audio/video content is carried as MPEG-TS streams (some phones apply HDCP encryption, so the receiving end must hold an HDCP license). According to testing, on Nexus/Pixel devices running Android 8.0 the Miracast source has been replaced by the Chromecast source, which suggests that Miracast is being gradually abandoned by Google.
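
For a sense of what that RTSP handshake looks like, the sketch below sends only the first message (M1, an OPTIONS request carrying the Wi-Fi Display Require header) to a sink. The sink address and port 7236 are common defaults and are assumed here; the real session continues through capability exchange (GET_PARAMETER/SET_PARAMETER), SETUP, and PLAY before the TS stream starts flowing over RTP.

```python
import socket

SINK_IP = "192.168.49.1"   # hypothetical Wi-Fi Direct group-owner address
SINK_PORT = 7236           # commonly used Wi-Fi Display RTSP port (assumed default)

# M1: the source asks which RTSP methods the sink supports.
m1 = (
    "OPTIONS * RTSP/1.0\r\n"
    "CSeq: 1\r\n"
    "Require: org.wfa.wfd1.0\r\n"
    "\r\n"
)

with socket.create_connection((SINK_IP, SINK_PORT), timeout=5) as conn:
    conn.sendall(m1.encode("ascii"))
    reply = conn.recv(4096).decode("ascii", errors="replace")
    # A cooperating sink answers "RTSP/1.0 200 OK" with a Public: header listing its methods.
    print(reply)
```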