Unit-VIII: Multimedia and Digital Video
Key Multimedia Concepts
Multimedia: The use of digital data in more than one format, such as the combination of text, audio, and image data in a computer file. The idea behind multimedia is to digitize traditional media such as words, sounds, and motion, and to mix them together with elements of a database.
Multimedia data compression:
Data compression attempts to pack as much information as possible into a given amount of storage. Compression ratios typically range from 2:1 to 200:1.
Compression Methods:
- Sector-oriented disk compression (integrated into the operating system, this compression is invisible to the end user)
- Backup or archive-oriented compression (compresses files before they are archived or downloaded over telephone lines)
- Graphics- and video-oriented compression (compresses graphics and video files before they are downloaded)
- Compression of data transmitted over low-speed networks (techniques used in modems and routers)
Data compression in action:
Data compression works by eliminating redundancy. For example, a block of text data containing 1,000 bits may have an underlying information content of only 100 bits; the rest is redundancy such as white space. The goal of compression is to reduce the 1,000-bit block toward the size of its underlying information, about 100 bits. The same principle applies to audio and video files.
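As a simple illustration of how eliminating redundancy shrinks data, the sketch below applies run-length encoding to a block of text that is mostly white space. The encoding scheme and the sample block are invented for this illustration; they are not part of any standard discussed in this unit.

    # Minimal run-length encoding sketch: repeated characters (e.g., runs of
    # spaces in padded text) collapse into (count, character) pairs.

    def rle_encode(data):
        encoded = []
        for ch in data:
            if encoded and encoded[-1][1] == ch:
                encoded[-1] = (encoded[-1][0] + 1, ch)
            else:
                encoded.append((1, ch))
        return encoded

    def rle_decode(pairs):
        return "".join(ch * count for count, ch in pairs)

    if __name__ == "__main__":
        block = "DATA" + " " * 96           # mostly white space, little information
        packed = rle_encode(block)
        assert rle_decode(packed) == block  # the round trip loses nothing
        print(len(block), "characters ->", len(packed), "(count, char) pairs")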
Compression Techniques:
Compression techniques can be divided into two major categories:
Lossy:
Lossy compression is one in which a given set of data undergoes a loss of accuracy or resolution after a cycle of compression and decompression. It is mainly used for voice, audio, and video data, typically when the data is intended to be transmitted across a medium. The two popular lossy compression standards are MPEG and JPEG.
Lossless:
Lossless compression produces output that, after decompression, is identical to the original input. It is mainly used for text and numerical data.
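A quick way to verify the lossless property is a compress/decompress round trip. The sketch below uses Python's standard zlib module purely as an example of a lossless compressor; the sample data is made up.

    import zlib

    # Text with heavy repetition compresses well and decompresses to the
    # exact original bytes -- the defining property of lossless compression.
    original = b"name,amount\n" + b"ACME,100\n" * 500

    compressed = zlib.compress(original, level=9)
    restored = zlib.decompress(compressed)

    assert restored == original                      # bit-for-bit identical
    print(f"{len(original)} bytes -> {len(compressed)} bytes "
          f"({len(original) / len(compressed):.0f}:1 ratio)")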
Multimedia Server:
A server is a hardware and software system that transforms raw data into usable information and provides it to users when needed. An e-commerce application requires a server to manage application tasks, storage, security, transaction management, and scalability.
Multiprocessing:
Multiprocessing is the execution of several tasks on multiple processors; it implies the ability to use more than one CPU to execute programs. The processors can be tightly or loosely coupled.
Symmetric multiprocessing:
Symmetric multiprocessing treats all processors as equal: any processor can do the work of any other, and work is dynamically assigned to whichever processor is available. Here, the operating system plays a crucial role.
Multitasking:
Multitasking means that the server operating system can run multiple programs and give the illusion that they are running simultaneously by switching control between them.
Two types of multitasking are:
- Preemptive
- Non-preemptive (cooperative; see the sketch after this list)
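The difference between the two lies in who decides when a task gives up the CPU: a preemptive operating system interrupts tasks on a timer, whereas a non-preemptive (cooperative) system relies on each task yielding voluntarily. The generator-based scheduler below is a toy sketch of the cooperative case, not a model of any particular server operating system.

    # Toy cooperative (non-preemptive) scheduler: each task runs until it
    # voluntarily yields, and the scheduler then switches to the next task.

    def task(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                      # voluntary yield point -- cooperative

    def run_cooperatively(tasks):
        ready = list(tasks)
        while ready:
            current = ready.pop(0)
            try:
                next(current)          # run until the task yields
                ready.append(current)  # re-queue it behind the others
            except StopIteration:
                pass                   # task finished; drop it

    run_cooperatively([task("A", 3), task("B", 2)])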
Multithreading:
Multithreading is a sophisticated form of multitasking and refers to the ability to support separate paths of execution within a single address space. A process is broken into independently executable tasks called threads.
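As a small illustration of threads as separate paths of execution sharing one address space, the sketch below uses Python's standard threading and queue modules; the worker function and job data are invented for the example.

    import threading
    import queue

    # Several worker threads share one address space (the same `results` list
    # and `jobs` queue) while following independent paths of execution.
    jobs = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker(worker_id):
        while True:
            try:
                item = jobs.get_nowait()
            except queue.Empty:
                return
            with lock:                       # shared data needs coordination
                results.append((worker_id, item * item))

    for n in range(10):
        jobs.put(n)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(sorted(r for _, r in results))     # squares of 0..9, computed by 4 threads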
Multimedia Storage Technology
Storage technology is becoming a key player in electronic commerce because the storage requirements of modern-day information are enormous.
Storage technology can be divided into two types:
- Network-based (disk arrays)
- Desktop-based (CD-ROM)
Disk arrays:
Disk arrays store enormous amounts of information and are becoming an important storage technology for firewall servers and large servers. Small arrays provide 5-10 gigabytes; large arrays provide 50-500 gigabytes. The technology behind disk arrays is RAID (redundant array of inexpensive disks), which offers a high degree of data capacity, availability, and redundancy. Current RAID products use multiple 5¼-inch disks.
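One common way RAID achieves its redundancy (a detail not spelled out above) is XOR parity: an extra parity block lets any single lost block be rebuilt from the surviving ones. The sketch below shows only that idea, with invented block contents, and is not a real RAID implementation.

    from functools import reduce

    # Parity-based RAID idea: the parity block is the XOR of the data blocks,
    # so any single lost block can be rebuilt from the survivors plus parity.

    def xor_blocks(blocks):
        return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

    data_blocks = [b"AAAA", b"BBBB", b"CCCC"]     # stripes on three data disks
    parity = xor_blocks(data_blocks)              # stored on a parity disk

    # Simulate losing disk 1 and rebuilding its block from the others + parity.
    surviving = [data_blocks[0], data_blocks[2], parity]
    rebuilt = xor_blocks(surviving)
    assert rebuilt == data_blocks[1]
    print("rebuilt block:", rebuilt)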
CD-ROM:
CD-ROM is the premiere desktop storage medium. It is a read-only memory, and a special CD-ROM drive is required to read it. Its main advantage is incredible storage density, which allows a single CD-ROM disc to hold around 530 MB of data. CD-ROM technology exhibits the following characteristics:
High information density: With optical encoding, a CD can contain some 600-800 MB of data.
Low unit cost: The unit cost in large quantities is less than two dollars, because CDs are manufactured by a well-developed replication process.
Read-only memory: A CD-ROM is read-only, so it cannot be written to or erased.
Modest random-access performance: Random access on a CD is better than on floppies because of the optical encoding methods, but it remains modest compared with hard disks.
Digital Video and Electronic Commerce
Digital video is binary data that represents a sequence of frames, each representing one image. When the frames are displayed at about 30 frames per second, the sequence gives the impression of continuous motion, that is, of video.
Characteristics of Digital Video:
Several characteristics of digital video differentiate it from traditional analog video:
- It can be manipulated, transmitted, and reproduced with no discernible image degradation, and it allows more flexible routing over packet-switching technology.
- The development of digital video compression technology has enabled new applications in the consumer electronics, multimedia computer, and communications markets.
- It poses interesting technical challenges: audio and video are constant-rate, continuous-time media, unlike text and still images.
- Even after compression, video requires on the order of 10 MB per minute.
Digital video compression/decompression:
Digital video compression takes advantage of the fact that a substantial amount of redundancy exists in video. An hour-long video that would require about 100 CDs uncompressed requires only one CD when compressed.
The process of compression and decompression is commonly referred to simply as compression, but it involves both operations. Decompression is inexpensive by comparison, because once a digital video has been compressed it can be stored once and decompressed many times. Implementations of the international compression standards are called codecs; the most widely used codecs are lossy.
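A rough back-of-the-envelope calculation shows why compression is essential. The frame size, color depth, frame rate, and CD capacity below are assumed figures for illustration, not numbers taken from the text, so the CD count comes out in the same ballpark as, rather than exactly equal to, the 100 CDs mentioned above.

    # Rough storage estimate for one hour of uncompressed video.
    # Assumed figures: 640x480 frames, 24-bit colour, 30 frames/s, 650 MB per CD.
    width, height, bytes_per_pixel, fps = 640, 480, 3, 30
    seconds = 60 * 60

    raw_bytes = width * height * bytes_per_pixel * fps * seconds
    cd_capacity = 650 * 1024 * 1024

    print(f"uncompressed hour: {raw_bytes / 1e9:.0f} GB "
          f"(~{raw_bytes / cd_capacity:.0f} CDs); "
          f"a ~100:1 codec brings that down to about "
          f"{raw_bytes / 100 / cd_capacity:.1f} CD")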
Types of Codecs:
Most codec schemes can be categorized into two types:
- Hybrid
- Software-based.
Hybrid: Hybrid codecs use a combination of dedicated processors and software, and they require specialized add-on hardware. The best-known examples of hybrid codecs are MPEG (Moving Picture Experts Group) and JPEG (Joint Photographic Experts Group).
Software-based: Software-only codecs run entirely on the host processor and need no special hardware; Apple's QuickTime and Microsoft's Video for Windows, described later in this unit, take this approach.
MPEG (Moving Picture Experts Group):
The Moving Picture Experts Group is an ISO working group whose purpose is to generate standards for high-quality compression of digital video.
MPEG I (Moving Picture Experts Group I):
MPEG I defines a bit stream for compressed video and audio optimized to a bandwidth of 1.5 Mbps, the data rate of audio CDs and DATs. The standard consists of three parts: audio, video, and systems. The systems part handles the synchronization of video and audio. MPEG I has been implemented in commercial chips. The frame resolution of MPEG I is 352x240 pixels at 30 frames per second, and its video compression ratio is about 26:1.
MPEG II (Moving Picture Experts Group II):
MPEG II specifies compression for broadcast-quality video. It defines a bit stream for high-quality, "entertainment-level" digital video. MPEG-2 supports transmission rates of about 2-15 Mbps over cable, satellite, and other transmission channels. The standard consists of three parts: audio, video, and systems. The systems part handles the synchronization of video and audio. MPEG II has been implemented in commercial chips.
The frame resolution of MPEG-2 is 720x480 pixels at 60 fields per second, and its typical data rate is 4 to 8 Mbps. Its most promising future lies in the rapid evolution of cable TV channels. Two other MPEG standards are:
- MPEG-3 (1920x1080 resolution, with data rates of 20 to 40 Mbps)
- MPEG-4 (covering speech and video synthesis)
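Taking the 4 to 8 Mbps data rates quoted above at face value, a quick calculation shows what an hour of MPEG-2 video implies for storage or transmission.

    # Storage implied by the MPEG-2 data rates quoted above (4 to 8 Mbps).
    for mbps in (4, 8):
        bytes_per_hour = mbps * 1_000_000 / 8 * 3600
        print(f"{mbps} Mbps -> about {bytes_per_hour / 1e9:.1f} GB per hour of video")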
JPEG (Joint Photographic Experts Group):
JPEG is a still-image compression algorithm defined by the Joint Photographic Experts Group and serves as a foundation for digital video.
JPEG is used in two ways in the digital video world:
- As a part of MPEG
- As motion JPEG
This standard has been widely adopted for video sequences (motion JPEG): JPEG compression is fast and can capture full-screen, full-rate video. It was designed for compressing either full-color or grayscale digital images of real-world scenes. It is a highly sophisticated technique that uses three steps:
- In the first step, a technique known as the DCT (discrete cosine transform) is applied.
- Next, a process called quantization reduces the precision of the DCT coefficients, and strings of identical values are compressed with run-length encoding.
- Finally, the result is compressed using a variant of Huffman encoding. (A miniature sketch of these steps follows.)
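The sketch below walks through those three stages in miniature on a single 8x8 block: a DCT, a coarse quantization, and run-length coding of the resulting coefficients, with the final Huffman stage only noted in a comment. The block contents and the quantization step are invented for illustration; real JPEG uses standard quantization tables, zig-zag ordering, and Huffman tables.

    import numpy as np

    # Miniature JPEG-style pipeline on one 8x8 block:
    #   1. DCT (discrete cosine transform)
    #   2. quantization (discard precision; most coefficients become 0)
    #   3. run-length coding of the coefficients
    # Real JPEG adds zig-zag ordering and Huffman coding of the RLE output.

    N = 8
    k = np.arange(N)
    # Orthonormal DCT-II basis matrix.
    C = np.sqrt(2 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
    C[0, :] = np.sqrt(1 / N)

    block = np.tile(np.linspace(50, 200, N), (N, 1))   # invented smooth 8x8 block
    coeffs = C @ block @ C.T                            # step 1: 2-D DCT
    quantized = np.round(coeffs / 40).astype(int)       # step 2: coarse quantization

    def run_length(values):
        pairs, prev, count = [], values[0], 1
        for v in values[1:]:
            if v == prev:
                count += 1
            else:
                pairs.append((prev, count))
                prev, count = v, 1
        pairs.append((prev, count))
        return pairs

    rle = run_length(list(quantized.flatten()))         # step 3: run-length coding
    print("nonzero coefficients:", int(np.count_nonzero(quantized)), "of", N * N)
    print("RLE pairs:", len(rle))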
Desktop Video Processing
Video on the desktop is a key element in turning a computer into a true multimedia platform, and the PC has steadily become a highly suitable platform for video. Desktop video processing requires upgrade kits, sound cards, video playback accelerator boards, video capture hardware, and editing software. Microphones, speakers, joysticks, and other peripherals are also needed.
Desktop video hardware for playback and capture:
Desktop video requires a substantial amount of disk space and considerable CPU horsepower. It also requires specialized hardware to digitize and compress the incoming analog signal from videotape.
Video playback:
Two lines of video playback products are available in the marketplace: video ASIC chips and board-level products. Broadly speaking, two types of accelerator boards are available:
- Video
- Graphics
Video capture and editing:
A video capture board is essential for digitizing incoming video for use in multimedia presentations or video conferencing. Video capture programs also include video-editing functions that allow users to crop, resize, and convert formats, and to add special effects to both audio and video, such as fades, embossing, zooms, and echoes. Developers are creating next-generation editing tools for business presenters and video enthusiasts, and the best graphical editing tools make complex procedures accessible even to novice users.
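In scripting terms, the basic crop, resize, and format-conversion operations can be applied to individual frames with an image library. The sketch below uses the Pillow library (an assumption of this example, not a tool named in the text), and the frame file names are made up.

    from PIL import Image

    # Hypothetical frame file names; Pillow works on single frames, and a
    # video editor would apply the same operations frame by frame.
    frame = Image.open("frame_0001.bmp")

    cropped = frame.crop((0, 0, 320, 240))          # crop to the top-left region
    resized = cropped.resize((160, 120))            # shrink for a preview window
    resized.save("frame_0001_preview.jpg", "JPEG")  # convert format on save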
Desktop video application software:
Any PC that is to handle digital video must have a digital-video engine available. Two significant digital video engines are:
- Apple’s QuickTime
- Microsoft's Video for Windows
Both are software-only engines; they do not need any special hardware.
Apple’s QuickTime:
QuickTime is a set of software programs from Apple that allows the operating system to play motion video sequences on a PC without specialized hardware. QuickTime has its own set of compression/decompression drivers. Apple's QuickTime was the first widely available desktop video technology to treat video as a standard data type: video data can be cut, copied, and pasted like text in a composition program. A QuickTime movie can have multiple sound tracks and multiple video tracks, and the QuickTime engine also supports synchronization between them.
Microsoft's Video for Windows:
Microsoft's Video for Windows is a set of software programs from Microsoft that allows the operating system to play motion video sequences on a PC without specialized hardware. Video for Windows has its own set of compression/decompression drivers. Microsoft chose a frame-based model, in contrast to QuickTime's time-based model.
Desktop video conferencing
Desktop video conferencing is gaining momentum as a communication tool. Face-to-face video conferences are already a common practice, allowing distant colleagues to communicate without the expense and inconvenience of traveling.
Early video conferencing relied on costly equipment to provide room-based conferencing. Growth is now being driven by desktop video conferencing, in which participants sit at their own desks, in their own offices, and call up others on their PCs much as they would use a telephone.
Types of desktop video conferencing:
Desktop video conferencing systems in today's market can be divided into three types, according to the transport they use:
- POTS (plain old telephone service)
- ISDN
- The Internet
Using POTS for video conferencing:
POTS systems are especially attractive for point-to-point conferencing because no additional monthly charges are assessed and special arrangements with the telephone company are unnecessary.
The drawback of a POTS solution is the restriction to the top speed of today's modems, 28.8 Kbps. It also needs conferencing software; once the software is properly installed, users can pipeline video, audio, and data down a standard telephone line.
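That 28.8 Kbps ceiling is why POTS conferencing video tends to be small and jerky. The calculation below makes this concrete; the split of the line between audio/data and video and the candidate frame rates are assumptions for illustration.

    # What a 28.8 Kbps modem leaves for video, assuming (for illustration)
    # that roughly a third of the line carries audio and data.
    line_bps = 28_800
    video_bps = line_bps * 2 // 3

    for fps in (5, 10, 15):
        bits_per_frame = video_bps // fps
        print(f"{fps} frames/s -> about {bits_per_frame // 8} bytes per compressed frame")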
Using ISDN for video conferencing:
ISDN lines offer considerably more bandwidth, up to 128 Kbps, but they require the installation of special hardware. For that reason, ISDN use has so far been largely restricted to companies and is rare in private residences. The following figure shows the basic architecture for tele- or video conferencing using ISDN network transport and switching.
This architecture is commonly found in videophones. The networks required for video conferencing are fiber-optic cable or analog POTS. For video compression and decompression, ISDN systems use the H.261 algorithm, specified by the CCITT (International Telegraph and Telephone Consultative Committee).
Figure: ISDN video or teleconferencing architecture
Using the Internet for Video Conferencing:
Two video conferencing programs are available on the Internet:
- CU-SeeMe
- MBONE
CU-SeeMe:
CU-SeeMe was the first software available for the Macintosh to support real-time multiparty video conferencing on the Internet. CU-SeeMe provides one-to-one, one-to-many, several-to-several, and several-to-many conferencing, depending on user needs, at minimal cost.
MBONE (Multicast Backbone):
The MBONE is a virtual network built on top of the Internet, created by Van Jacobson, Steve Deering, and Stephen Casner in 1992. Its purpose is to minimize the amount of data required for multipoint audio/video conferencing. The MBONE is free; it uses a network of mrouters (multicast routers) that support IP multicast, and it enables access to real-time interactive multimedia on the Internet. The MBONE uses a small subset of the class D IP address space (224.0.0.0 - 239.255.255.255) assigned for multicast traffic; addresses in the 224.2.0.0 range are used for multimedia conferencing.
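The class D range and the 224.2.0.0 conferencing block quoted above can be checked with Python's standard ipaddress module; the sample addresses below are arbitrary.

    import ipaddress

    # Class D (multicast) space quoted above, and the MBONE block used for
    # multimedia conferencing sessions (addresses beginning with 224.2).
    class_d = ipaddress.ip_network("224.0.0.0/4")        # 224.0.0.0 - 239.255.255.255
    conferencing = ipaddress.ip_network("224.2.0.0/16")

    for sample in ("224.2.127.254", "239.1.2.3", "192.0.2.1"):
        addr = ipaddress.ip_address(sample)
        print(sample,
              "multicast" if addr in class_d else "unicast",
              "(conferencing block)" if addr in conferencing else "")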