Why did computer video outputs go from digital to analog, then back to digital?
While early microcomputers used analog video outputs (often to use a television as a display), higher end machines such as the BBC Micro or Commodore 128 supported a digital RGB (or RGBI) video output. In the IBM-compatible world, the CGA and EGA display adapters likewise had digital outputs.
When the VGA standard arrived in 1987, it was analog rather than digital. VGA and analog output held sway for well over a decade, but then DVI, DisplayPort and HDMI took video outputs back to digital.
Why did video standards abandon digital in the first place, and then return to it several years later?
Tags: history, video, display
asked 8 hours ago
Kaz
The answer to "why RGBI in 1981 instead of jumping straight to RGBHV" is probably going to be something boring like, "because memory was expensive and there wasn't a market for >16 colors that justified adding more wires to the monitor cable".
– snips-n-snails
7 hours ago
5 Answers
Early digital video outputs, like CGA, were not really akin to later standards such as DVI and its follow-ons. The reason for using multiple lines to carry the different analog portions of the signal to the monitor was to prevent crosstalk between those signals.
You can see this in early computers like the Commodore 64 and Atari 800, which separated the luminance and chrominance signals to produce a cleaner display. Similarly, CGA (RGBI) separated each color component from the sync components in order to deliver a cleaner version of all those signals to the CRT, minimizing crosstalk. VGA, which was not limited to only 4 bits for representing color information, continued this approach of separate signal lines for color and sync.
Remember, within the CRT there are only 3 color signals, controlling the 3 electron guns that produce RGB on the display phosphors. So when VGA took the technology from CGA's 2^4 possible colors to VGA's 2^18 possible colors, it was still only necessary to separate the R from the G and the B signals. There would be no point in using 18 separate lines to carry 18 bits of digital color data to only 3 electron guns in the CRT.
So the more essential technology change was the move from CRTs, which are analog peripherals, to LCD screens, which incorporate more digital processing in controlling their individual pixel intensities. With LCDs there are obviously no purely analog electron guns assigned to the primary colors. This fact, coupled with enormous advances in the bandwidth of serial communication channels, allows modern displays to carry the full digital display signal in its computer-native form to the display's processor.
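As a rough illustration of the wire-count arithmetic (a minimal Python sketch, not taken from any real hardware; the color counts are the ones quoted above):

    import math

    def parallel_color_wires(colors):
        """Wires needed to present a color as a parallel binary code, one wire per bit."""
        return math.ceil(math.log2(colors))

    for name, colors in [("CGA (RGBI)", 2**4), ("EGA", 2**6), ("VGA output", 2**18)]:
        print(f"{name}: {colors} colors -> {parallel_color_wires(colors)} digital color wires, "
              f"vs. 3 analog lines (R, G, B) regardless of depth")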
edited 4 hours ago
answered 8 hours ago
Brian H
Older monitors had analog timing and, with one notable exception, were designed so that the signals presented on their inputs at any moment would fully describe the color to be displayed at that moment. The only monitors that used any sort of time multiplexing on their inputs were those that used the same analog composite color encoding methods used in broadcast television receivers (and typically used much of the same circuitry to decode the signals).
Older "digital" monitors would need to send video signals over enough wires to identify every possible color as a binary number. For a 64-color monitor like an EGA display, that required six signals, and for a 262,144-color monitor like the VGA it would have required 18, which is too many to be practical. Using an analog RGB or YUV (today called "component video") signal reduced the number of signals required to three, without imposing any constraints on pixel timing.
Newer standards like HDMI require sending high-speed serial bit streams with many bits of data per pixel. That in turn requires higher data rates and thus faster switching rates than could be accommodated with older technologies.
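To see why the serial links need such fast switching, here is a back-of-the-envelope sketch (assumptions, not measurements: 640x480 at 60 Hz with a 25.175 MHz pixel clock, 8 bits per color channel, and single-link-DVI-style TMDS coding that expands each 8-bit sample to a 10-bit symbol per channel):

    # Rough comparison of per-wire switching rates: analog/parallel vs. serial digital.
    pixel_clock_hz = 25.175e6          # assumed 640x480 @ 60 Hz pixel clock
    tmds_symbol_bits = 10              # TMDS sends 10 bits per 8-bit sample per channel

    analog_or_parallel_rate = pixel_clock_hz                   # one value per pixel per wire
    serial_rate_per_channel = pixel_clock_hz * tmds_symbol_bits

    print(f"Analog RGB / parallel digital: ~{analog_or_parallel_rate / 1e6:.1f} MHz per wire")
    print(f"Serial TMDS data channel:      ~{serial_rate_per_channel / 1e6:.1f} Mbit/s per pair")
    # Roughly an order of magnitude faster switching per wire than the pixel clock,
    # which older cables, connectors and driver circuits were never designed to handle.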
answered 6 hours ago
supercat
Before VGA was invented, CGA (RGBI) used 4 wires on the cable to get 16 colors and EGA used 6 wires to get 64 colors. Add three more for sync signals and signal ground and this is not a problem: simple cables and connectors easily carry EGA's roughly 16.3 million pixels per second of digital signals to the monitor.
Then the Amiga was released. It supports 4 bits per color component, or 12 bits of color for 4096 colors. It can be connected to a standard color TV directly (via SCART analog RGB), and the RGB is also used to generate the color composite and RF outputs. It also has a digital RGBI interface, but that only gives 16 colors, so why use it? Analog RGB is much better, and even composite over a single RCA connection is tolerable.
Enter VGA. You have up to 8 bits per pixel at about 28.3 million pixels per second. Digital video links such as LVDS or SDI had not been invented yet, because there had been no need for them. But RAMDACs existed. What any sane designer would do is add one of these, gaining a color look-up table and generating 6 bits of output per component, or 18 bits per pixel, converted directly to analog RGB. That way no huge bundle of digital wires is needed, just 3 coaxial lines for analog video plus 2 wires for sync. The monitor can also consume the analog signals as they are, since a digital signal would have to be converted to analog anyway to drive the CRT guns. It simply made no sense to connect VGA digitally; either the number of colors or the palette look-up feature would have had to be discarded.
Besides VGA, other vendors used analog connections too (Sun/SGI, Apple, ...). Resolutions, refresh rates and color depths kept increasing, and transmitting high-bandwidth analog video needs more precision and engineering to keep acceptable quality. Meanwhile LVDS was invented and found use in LCD panels as a digital video link; laptops need LCD panels, so they got LVDS chips after the color look-up table, and LVDS eventually got integrated into chipsets. Finally, CRT monitors started to be replaced with LCD displays. With both the source and the display being digital, the high-quality analog link between them could be cheaply replaced by a digital interface, avoiding the extra conversions: first with laptop-based LVDS technology in various forms, but those were just external versions of internal laptop display links, so quite early on the TMDS-based DVI link replaced them. More bandwidth and better quality are obtained far more easily and cheaply over a digital link than by doing analog conversions at both ends that are not even necessary. The rest is history.
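The RAMDAC arrangement described above can be sketched in a few lines (illustrative Python only; the 256-entry table with 6 bits per component matches the VGA figures quoted in this answer, the demo palette is arbitrary, and 0.7 V is the usual VGA full-scale analog level):

    # Toy model of a VGA-style RAMDAC: an 8-bit framebuffer value indexes a 256-entry
    # color look-up table of 6-bit components (18 bits total), which is then converted
    # straight to three analog R, G, B levels -- no wide digital bus to the monitor.
    palette = [(i % 64, (i * 3) % 64, (i * 7) % 64) for i in range(256)]   # arbitrary demo palette

    def ramdac(pixel_index, vref=0.7):
        """Map one 8-bit pixel index to three analog voltages between 0 and vref."""
        r, g, b = palette[pixel_index]
        return tuple(round(component / 63 * vref, 3) for component in (r, g, b))

    print(ramdac(5))    # one byte of video memory becomes three analog levels on the cable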
answered 6 hours ago
Justme
I believe it was cost and terminal emulation. RGB monitors were very expensive at the time and would not work (easily) as dumb terminals. IBM wanted a "low" cost display that was double the resolution of CGA and could be used to emulate their mainframe 3270 terminals. EGA was a temporary stop-gap design while work continued on VGA: 640x480 graphics and a sharp 80-column by 25-row text display in "B&W" and color. Lotus 1-2-3 and the 3270 display never looked as good as they did on a VGA screen.
Business PC sales really took off when VGA hit the streets. You could now use one display to do PC work and to emulate an IBM mainframe 3270 terminal. DEC VT100/200 emulation and Burroughs (Unisys) B6000/A10 TDI emulation soon followed. VGA was the key to having only one terminal at your desk. I was in banking in Chicago at the time and we couldn't get enough IBM XTs with VGA and 3270 cards. We made a lot of money using those 640x480 VGA monitors for PC work and as dumb terminals. Ethernet-based networks were not mainstream yet; mainframes paid the bills.
answered 7 hours ago
Aoresteen
That question is built on somewhat weak ground. After all, most of the video formats you list as 'digital' aren't really such, or at least no more so than the analogue ones. Just because an output like RGBI uses two levels per channel doesn't make it digital; it merely features a restricted number of signal levels.
Digital signalling only started with formats like DVI/HDMI/etc., where the signal data is transferred not as levels but as a data stream. Everything before that isn't digital.
Now, the decision to use RGBI was about saving the components needed to encode the signal with some complex modulation (like FBAS/composite or S-Video), only to add even more components to the display to decode it again.
And yes, as snips-n-snails suspects, it's all about the amount of data, just not so much about memory as about transfer bandwidth. Using multiple lines increases bandwidth without increasing the data rate per line. Increasing the number of values per step (and per lane) likewise increases bandwidth without increasing the signalling rate.
Here also hides the reason why it took so long to switch to real digital transfer: analogue already delivered an awesome amount of bandwidth. But as bandwidth increased even further, signal distortion that is hard to compensate grew as well.
Going digital offers the solution by separating the transfer from the content. Distortions at the transfer level are compensated or corrected without influencing the data (and thus the image) transferred.
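The bandwidth point can be put into one line of arithmetic (a sketch with illustrative numbers; the 14.318 MHz dot rate is the familiar CGA clock, and treating an analog line as 64 distinguishable levels is just an assumption for the comparison):

    import math

    def raw_capacity_bps(lanes, symbol_rate_hz, levels_per_symbol):
        """More lanes, or more levels per symbol, raise capacity without a faster clock."""
        return lanes * symbol_rate_hz * math.log2(levels_per_symbol)

    # Four binary RGBI lines switching at a CGA-like 14.318 MHz dot rate:
    print(f"{raw_capacity_bps(4, 14.318e6, 2) / 1e6:.0f} Mbit/s")    # ~57 Mbit/s
    # Three analog RGB lines at the same rate, treated as 64 levels each --
    # the same number of signalling steps, far more information per step:
    print(f"{raw_capacity_bps(3, 14.318e6, 64) / 1e6:.0f} Mbit/s")   # ~258 Mbit/s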
edited 6 hours ago
answered 7 hours ago
Raffzahn
"A digital signal is a signal that is being used to represent data as a sequence of discrete values; at any given time it can only take on one of a finite number of values." - en.wikipedia.org/wiki/Digital_signal
– Bruce Abbott
2 hours ago