Non-visual Computers - thoughts?


TL;DR



Let's say that computers were invented at a school for the blind in the mid-1800s. How would today's technology, based on these non-screen-based computers, be different?



ETA: To clarify and hopefully narrow this enough -- I'm assuming that newer products will be influenced by functional computers predating movies & television: radio and the telegraph/telephone may be more of the communication models. Also, vestigial quirks would persist, just as numpads on phones and computer keyboards are arranged differently due to their separate origins, our "Save" icon may confuse those who didn't grow up with 3.5" floppies (I'm from the 5.25" era myself - Apple //c!), and we still call that thing in a car a "glove compartment" despite not wearing specific driving clothing any more.



So while sighted potential users greatly outnumber the blind ones, they're from a world where computers have always been fully accessible to the blind (so accessibility is not an afterthought), and that has probably driven the development of the CS field for quite a while.



Background elements



Braille had already been invented by the early 19th century, and it was derived from a military application (Night Writing, for Napoleon's army) -- much like our computers (stored programs and some of the more theoretical elements were codified during WWII). https://en.wikipedia.org/wiki/Braille



Punch cards for weaving had been invented in 1803 -- and for a while, schools for the blind were often trade schools. The first one (https://en.wikipedia.org/wiki/Institut_National_des_Jeunes_Aveugles) was also named the "National Institute of the Working Blind", and was famous for graduating organists.



So now let's say they got an early Jacquard-loom-head type machine (instead of organs). From https://en.wikipedia.org/wiki/Jacquard_loom#Importance_in_computing:




... The ability to change the pattern of the loom's weave by simply changing cards was an important conceptual precursor to the development of computer programming and data entry. Charles Babbage knew of Jacquard looms and planned to use cards to store programs in his Analytical Engine. In the late 19th century, Herman Hollerith took the idea of using punched cards to store information a step further when he created a punched card tabulating machine which he used to input data for the 1890 U.S. Census.




(Note that this loom also appears to be a French invention.)



Charles Babbage & the Analytical Engine -- according to Wikipedia (sorry that I keep going back to that source, but I'm assembling fragments of things I thought I knew or picked up -- I'm no tech historian -- and Wikipedia's the easiest place to assemble the threads), he was self-taught from reading many mathematicians, some of whom were French, and was definitely fighting the British Establishment.



From https://en.wikipedia.org/wiki/Charles_Babbage#Computing_pioneer:




While Babbage's machines were mechanical and unwieldy, their basic architecture was similar to a modern computer. The data and program memory were separated, operation was instruction-based, the control unit could make conditional jumps, and the machine had a separate I/O unit




So a computer doesn't need to be print-derivative.



We have punched cards (tangible, non-alphabetic) manipulating rules and representations of numbers. As Ada Lovelace said:




"We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves."




So what if punched cards went in, braille results came out? Things may have stayed mechanical longer, instead of moving to processors as we know them, but there'd also be almost a century's extra progress. "Screens" may have moved to Refreshable Braille Displays - but would there be "windows" and other simultaneous processing?



The "World Building" or AltHistory part --



Just as the Internet was very US-focused in the beginning (which still has legacy effects on domain names and rules), perhaps in this world computing (and thus the internet?) was dominated by French research and blind computer scientists. Look at the Minitel for an example of France being way ahead of the curve! It started as a phone-book replacement, but provided message boards and financial services.



Why I'm asking



I'm documenting navigation of applications designed with minimal concern for accessibility. My particular job seems to be describing how to navigate web applications with screen readers. Screen readers (which read text aloud to blind/low-vision computer users) address everything in a pretty linear way. (Also, we have to keep all navigation keyboard-based -- it's more predictable than a mouse.)



When a window pops up, where does the focus go? Do users know there's a new dialog on screen? Where does the focus go when the error message goes away? (To the last place it was, to the line with the error, or to the top of the page?) It's easy for the sighted to notice a missing field, the blinking cursor, or that something changed on the screen: but what if the default were audible and tactile? How would the interfaces change?
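For context, here's a minimal sketch of the focus bookkeeping I mean, assuming a plain DOM/ARIA dialog rather than any particular framework (the function names and selectors are just illustrative):

    // Sketch only: remember where focus was, move it into a dialog when it
    // opens, and put it back when the dialog closes, so a screen-reader user
    // isn't left guessing.
    let lastFocused: HTMLElement | null = null;

    function openDialog(dialog: HTMLElement): void {
      // Remember where the user was before the pop-up appeared.
      lastFocused = document.activeElement as HTMLElement | null;

      dialog.removeAttribute("hidden");
      dialog.setAttribute("role", "dialog");
      dialog.setAttribute("aria-modal", "true");
      dialog.tabIndex = -1; // make the container itself focusable as a fallback

      // Move focus into the dialog so assistive tech announces it.
      const firstField = dialog.querySelector<HTMLElement>("input, select, textarea, button");
      (firstField ?? dialog).focus();
    }

    function closeDialog(dialog: HTMLElement): void {
      dialog.setAttribute("hidden", "");

      // Send focus back to where it was, rather than letting it drop to <body>.
      lastFocused?.focus();
    }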



What different communication elements may be emphasized? Would casual computers (like cell phones) do the same things or different ones?



I know answers could go in a steampunk direction, but they don't have to, and the proposed tech doesn't need to stay that way.







Tags: technological-development, communication, computers, senses






edited 4 hours ago

asked 9 hours ago by April










  • Seems like we would all learn braille in school, and handwashing before using a keyboard/braille-output would be socially enforced. Braille is not a fast way to transmit information, so brevity would be valued over format, and formatting must add value or context to the message. It's a whole different way of thinking about communication.
    – user535733, 7 hours ago







  • "When windows pop-up, where did the focus go?" The focus went where it went. The screen reader should not try to guess; it should of course ask the underlying windowing system which window has the focus. "Do the users know there's a new dialog on screen?" Usually, but definitely not always. I have typed inappropriate input in the wrong window many many times.
    – AlexP, 6 hours ago










  • For the sighted, the focus is normally obvious. When programmers don't make the next focus clear, it can be confusing. That's a lot of my job: explaining where the focus probably is, or how to tell where it is, and if a pop-up needs to be listened to or can be dismissed (and which shortcut keys will dismiss it.)
    – April, 5 hours ago










  • You've given a good deal of information for people to reflect on in giving an answer, but if it's going to become a mass-market product, it'll develop the way the market dictates. Without clearly defining the way the market works in your world (i.e., at its most basic level, is it demand-led or supply-led, and to what degree), and defining patent law and the likely outcome in legal systems (globally) of stealing tech (Apple got away with lots of "adaptations" of the Unix OS), this question becomes too broad and opinion-based. To give you time to edit, voting to put on hold as too broad.
    – Chickens are not cows, 5 hours ago






  • "For the sighted, the focus is normally obvious": that used to be the case, but then Windows 10 came and now it isn't all that obvious... Sometimes the window with focus has a colored titlebar, sometimes it doesn't, sometimes it doesn't have a titlebar at all. I've learned to anticipate where the focus is and always click pre-emptively in the window which I think has the focus to confirm that the text insertion cursor is there.
    – AlexP, 4 hours ago













4 Answers



















Youngsters. The first computers read and wrote punched cards or punched paper tape; they did not have any kind of user interface where being blind or sighted mattered.



It was perceived as a major revolution when some smart technician adapted a typewriter to print computer output; electric teletypewriters were then adapted so that operators could type commands into the computer. But teletypewriters are still purely linear devices.



Up until the late 1960s or early 1970s most users did not even see the computer or come anywhere near it. One wrote a program on a special form, the nice ladies in the card punch room converted it to punched cards, the cards were given to an operator through a wicket, and a note was made in a register; one day later one queued to receive the cards back, together with whatever output the program had produced, printed on 132-column fan-folded paper.



(Ever wondered why terminal emulators have options for 80 or 132 characters per line? That's why. A punch card could hold 80 characters and was treated as one line of input. One line of computer printout had 132 characters. Those numbers were burned into the collective memory of informaticians.)
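A toy sketch of those fixed record widths (the constants and helper names here are just illustrative):

    // 80 columns per punched card, 132 columns per printer line.
    const CARD_WIDTH = 80;
    const PRINT_WIDTH = 132;

    // One source line became one card image: blank-padded or truncated to 80 columns.
    function toCardImage(line: string): string {
      return line.padEnd(CARD_WIDTH).slice(0, CARD_WIDTH);
    }

    // Printer output was broken into 132-column lines.
    function toPrintLines(text: string): string[] {
      const lines: string[] = [];
      for (let i = 0; i < text.length; i += PRINT_WIDTH) {
        lines.push(text.slice(i, i + PRINT_WIDTH));
      }
      return lines;
    }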



To this day, operating systems in the Unix lineage are ready to interact with the user via a dumb terminal, with no graphics and no full-screen character-cell capabilities.



The conclusion is that it doesn't matter where the first computers were made. It doesn't matter whether the inventor and the first users were blind or sighted. The first computer terminals which had the ability to run full-screen cursor-addressable character-cell interfaces (not graphics, just a rectangular array of characters) became available in the mid-1970s; that is, a staggering 25 years after the introduction of the UNIVAC I, the first commercially available programmable computer, and 30 years after the first well-known programmable computer, ENIAC, became operational in production for the U.S. Army. A full human generation separates the first computers from the first user interfaces where being sighted was necessarily an advantage.




















  • But this "X years after UNIVAC" skips over the fact that movies and television also became major means of sharing information in the meantime -- screens became something people were used to. What if this were all 50 years BEFORE TV, so we're more in a telegraph/telephone/radio world?
    – April, 5 hours ago










  • @April: Computer screens became a thing quite late. For decades computer user interfaces, if they even had a user interface, were centered around teletypewriters, most usually Teletype Model 33 or (in the glorious socialist world) clones of it. When screen-based terminals first appeared (in the late 1960s, early 1970s) they had no extra functionality, and were even called "glass teletypewriters". Basically, as soon as the general tech level allows for full-screen user interfaces, computers will get full-screen user interfaces.
    – AlexP, 4 hours ago



















I see no difference in how computers would have developed.



The first computers used punched cards for both input and output (one of the favorite pranks among nerds in those days was to swap two random cards in the physical folder containing them when the owner was not paying attention), and graphics came much later.



And the reason is that when you move to mass usage, you have to rely on something that fits the masses. Punch cards don't. Braille doesn't, except for those who have to learn it. But we as a species use sight as our main means of communication, so the move to graphics was inevitable.
















  • I'm proposing a bit of an AU -- given that a core of blind pre-computer scientists are focusing on the problem, we're dealing with legacies of that system (just like area codes have legacies of the dial system, with the most-populous-at-the-time regions getting the easiest-to-physically-dial codes, or like our computer keypads have 9 in the upper right, but phones have 9 in the lower right...).
    – April, 8 hours ago










  • That's true from the data-processing point of view, but not true from an operational perspective. Even the earliest computers had a lot of lights on their control consoles. And as with any visual consoles, all those lights had to be perceived by the operator at once. To make them usable for blind operators, consoles and panels would need significant redesign.
    – Alexander, 7 hours ago










  • @Alexander: Operators were not users. Operators were technicians who ran the computer for the users. I know, I was a user; the operators were god-like. Users never even came anywhere near the computer.
    – AlexP, 6 hours ago










  • @AlexP do you suggest that in this scenario operators must be sighted?
    – Alexander, 5 hours ago






  • Not necessarily. The blinkenlights could have been replaced with tactile buttons/bumps with no loss of functionality. It's not as if anybody was ever expected to react in real-time to them; their only use was in diagnostic mode, and in that mode they were stable so that there was no need to see them all at the same time -- a blind operator could feel them with the fingers.
    – AlexP, 5 hours ago




















I think the biggest difference would be in the development of user interfaces.



If computers had been designed primarily by and for blind users, I imagine a much more sophisticated version of the Refreshable Braille Display would be in common use by now. I'm imagining a grid of keys, instead of a single row, forming a kind of tactile screen. This would allow for parallel processes happening in different zones on the grid, like windows. Users could tap in a particular zone to get an audio readout of that process, to advance the readout, or to drop the cursor and start typing; much like modern haptic screens, different touches could indicate different actions. An audio cue could announce a pop-up alert, which would always appear in a designated zone. Afterwards, the user could return their hands to whatever process they wanted. Audio cues could also alert users to things like empty fields; if the grid is labeled like a battleship board, then an alert like "Input required in Zone M6" could direct the user.
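A quick sketch of that battleship-style zone addressing (the grid size and labels are invented for illustration):

    // Hypothetical tactile grid: zones labeled like a battleship board,
    // e.g. "M6" = column M, row 6. Dimensions are made up.
    const COLS = 20; // columns A..T
    const ROWS = 12; // rows 1..12

    // Convert a zone label such as "M6" into grid coordinates.
    function zoneToCoords(label: string): { col: number; row: number } {
      const col = label.charCodeAt(0) - "A".charCodeAt(0);
      const row = parseInt(label.slice(1), 10) - 1;
      if (col < 0 || col >= COLS || row < 0 || row >= ROWS) {
        throw new Error(`Zone ${label} is off the grid`);
      }
      return { col, row };
    }

    // The audio cue directing a user to a zone, e.g. "Input required in Zone M6".
    function inputRequiredCue(label: string): string {
      zoneToCoords(label); // validates the label
      return `Input required in Zone ${label}`;
    }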



If blind people continued to be the primary developers of computers past the initial stages, advances in tactile interfaces would probably have replaced the advances in graphics. A tactile screen, like the one described above, would be a mechanical marvel, but wouldn't require much processing power to run; certainly nothing like playing a video. So the push for more and more powerful processors wouldn't have been as great. The tactile screen might be able to produce static images, by pushing pins up to form the outline of a shape, but probably most entertainment on computers would be in audio form. The podcast boom would have come much sooner, probably replacing the YouTube boom.



I hope that helps!




















  • I love it! That's part of what I was curious about. So we might not have had a Moore's-law style of increase, since processing power wouldn't be the biggest limiter?
    – April, 5 hours ago










  • I believe that would be true, though I'd be open to arguments to the contrary. With a tactile interface, you would be limited by the human ability to process audio and haptic information. So, once you had the ability to run the tactile screen and produce crisp sound, the push to improve processing power would not be commercially urgent. Only in the last few years, where the need to process large data sets has become critical, would there be a real need for powerful, commercially available processors.
    – IAntoniazzi, 4 hours ago



















I think a good technology to consider for comparison is the telegraph. The telegraph also began as a technology for processing bits of information that, while accessible (it used the sound and touch of tapping), was cumbersome to use, in that it required the user to learn a specialized code both to input and to interpret messages. So a specialized profession developed around the telegraph, which gave way once an interface easier for the layman (the telephone) was developed. So, if you have the adoption of punch-card computers a good century before cathode-ray tubes were sophisticated enough to create purely visual displays, you need to think about how you would stop CRTs from overwhelming punch cards and Refreshable Braille Displays.



Keeping telegraphs in mind, one interesting possibility is if your early punch card computers could interface directly with telegraph lines. The French were already pioneers at long distance communication: under Napoleon, signal towers were built connecting Paris to the frontiers of the country. What if a series of punch cards at a central computer in Paris could be sent to a punch card writer in Marseilles almost instantly? You could have a sort of internet under Napoleon III!



Ultimately, though, I think tactile interfaces are going to have a hard time catching on very widely, even with these boosts. The best bet is to try to stimulate a jump to more audio displays. This is where a lot of technology is trying to move now: natural-language interfaces, like Siri or Alexa. Maybe if a punch-card internet develops, you'd still have specialized data entry types for input, but the displays would instead become temporary phonographs?



Honestly, there are a lot of repercussions that could come from this, but good luck exploring! Some other resources to look at are 'Jacquard's Web' by James Essinger, a non-fiction book on the development of the Jacquard loom and some of its significance, and 'The Difference Engine' by William Gibson and Bruce Sterling, which not only helped launch the steampunk genre but also uses Babbage-style central computers as the major point of departure for its world.
















  • I like this -- I was sort of picturing a Napoleon + Discworld Clacks system. I do think that since we're a while before movies and TV, but close to the telephone/radio age, we could end up with Siri/Alexa before 8-bit graphics! (I read The Difference Engine ages ago, and a bunch of Neal Stephenson, but I barely remember it now. I'm not too into steampunk as a genre, due to the way it seems to skip over colonization issues, but I will look for the nonfiction recommendation!)
    – April, 5 hours ago














4 Answers
4






active

oldest

votes








4 Answers
4






active

oldest

votes









active

oldest

votes






active

oldest

votes









6













$begingroup$

Youngsters. The first computers read and wrote punched cards or punched paper tape; they did not have any kind of user interface where being blind or sighted mattered.



It was perceived as major revolution when some smart technician adapted a typewriter to be able to print computer output; electric teletypewriters were then adapted so that operators could type commands into the computer. But teletypewriters are still purely linear devices.



Up until the late 1960s or early 1970s most users did not even see the computer or come anywhere near it. One wrote a program on a special form, the nice ladies in the card punch room converted it to punched cards, the cards were given to an operator through a wicket, and a note was made in a register; one day later one queued to receive the cards back, together with whatever output the program had produced, printed on 132-column fan-folded paper.



(Ever wondered why terminal emulators have options for 80 or 132 characters per line? That's why. A punch card could hold 80 characters, and was assimilated to one line of input. One line of computer printout had 132 characters. Those numbers were burned in the collective memory of informaticians.)



Up to this day operating systems in the Unix lineage are ready to interact with the user via a dumb terminal, with no graphics and no full-screen character cell capabilities.



The conclusion is that it doesn't matter where the first computers were made. It doesn't matter whether the inventor and the first users were blind or sighted. The first computer terminals which had the ability to run full-screen cursor-addressable character-cell interfaces (not graphics, just a rectangular array of characters) became available in the mid-1970s; that is, a staggering 25 years after the introduction of the UNIVAC I, the first commercially available programmable computer, and 30 years after the first well-known programmable computer, ENIAC, became operational in production for the U.S. Army. A full human generation separates the first computers from the first user interfaces where being sighted was necessarily an advantage.






share|improve this answer











$endgroup$














  • $begingroup$
    But this X years after Univac skips that movies and televisions also became major means of sharing information in the meantime -- screens became something people were used to. What if this were all 50 years BEFORE TV, so we're more in a telegraph/telephone/radio world.
    $endgroup$
    – April
    5 hours ago










  • $begingroup$
    @April: Computer screens became a thing quite late. For decades computer user interfaces, if they even had a user interface, were centered around teletypewriters, most usually Teletype Model 33 or (in the glorious socialist world) clones of it. When screen-based terminals first appeared (in the late 1960s, early 1970s) they had no extra functionality, and were even called "glass teletypewriters". Basically, as soon as the general tech level allows for full-screen user interfaces, computers will get full-screen user interfaces.
    $endgroup$
    – AlexP
    4 hours ago
















I see no difference in how computers would have developed.



The first computers used punched cards to take input and give output (one of the favorite pranks among nerds in those days was to swap two random cards in the physical folder containing them when the owner was not paying attention), and graphics came much later.



And the reason is that when you move to mass usage, you have to rely on something that fits the masses. Punch cards don't. Braille doesn't, except for those who have to learn it. But we as a species use sight as our main means of communication, so the move to graphics was inevitable.






– L.Dutch (answer score 5, answered 8 hours ago, last edited 6 hours ago)

  • I'm proposing a bit of an AU -- given that a core of blind pre-computer scientists is focusing on the problem, and we're dealing with legacies of that system (just like area codes have legacies of the dial system, with the most populous-at-the-time regions having the easiest-to-physically-dial codes, or like our computer keypads have 9 in the upper right, but phones have 9 in the lower right....
    – April, 8 hours ago
  • That's true from the data-processing point of view, but not from the operational perspective. Even the earliest computers had a lot of lights on their control consoles, and as with any visual console, all those lights were meant to be perceived by the operator at once. To make them usable for blind operators, consoles and panels would need significant redesign.
    – Alexander, 7 hours ago
  • @Alexander: Operators were not users. Operators were technicians who ran the computer for the users. I know, I was a user; the operators were god-like. Users never even came anywhere near the computer.
    – AlexP, 6 hours ago
  • @AlexP do you suggest that in this scenario operators must be sighted?
    – Alexander, 5 hours ago
  • Not necessarily. The blinkenlights could have been replaced with tactile buttons or bumps with no loss of functionality. It's not as if anybody was ever expected to react to them in real time; their only use was in diagnostic mode, and in that mode they were stable, so there was no need to see them all at the same time -- a blind operator could feel them with their fingers.
    – AlexP, 5 hours ago

I think the biggest difference would be in the development of user interfaces.



If computers had been designed primarily by and for blind users, I imagine a much more sophisticated version of the Refreshable Braille Display would be in common use by now. I'm imagining a grid of keys instead of a single row, forming a kind of tactile screen. This would allow for parallel processes happening in different zones on the grid, like windows. Users could tap in a particular zone to get an audio readout of that process, to advance the readout, or to drop the cursor and start typing; much like modern haptic screens, different touches could indicate different actions. An audio cue could alert users to a pop-up, which would always appear in a designated zone. Afterwards, the user could return their hands to whatever process they wanted. Audio cues could also alert users to things like empty fields; if the grid is labeled like a battleship board, then an alert like "Input required in Zone M6" could be used to direct the user.
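As a sketch of how such a zone grid might be modeled in software (my own Python illustration of the idea above; the names and the 13x8 grid size are arbitrary assumptions, not an existing API):

```python
from dataclasses import dataclass, field

ROWS = "ABCDEFGHIJKLM"   # row letters, battleship-style (13 rows, arbitrary)
COLS = range(1, 9)       # column numbers 1-8 (arbitrary)

@dataclass
class Zone:
    label: str                   # e.g. "M6"
    process: str | None = None   # name of the process assigned to this zone
    needs_input: bool = False

@dataclass
class TactileScreen:
    zones: dict[str, Zone] = field(default_factory=lambda: {
        f"{r}{c}": Zone(f"{r}{c}") for r in ROWS for c in COLS
    })

    def assign(self, label: str, process: str) -> None:
        """Give a zone to a process, the way a window manager gives it a window."""
        self.zones[label].process = process

    def request_input(self, label: str) -> str:
        """Mark a zone as waiting for input and return the spoken alert text."""
        self.zones[label].needs_input = True
        return f"Input required in Zone {label}"

if __name__ == "__main__":
    screen = TactileScreen()
    screen.assign("M6", "tax form")
    print(screen.request_input("M6"))   # -> Input required in Zone M6
```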



If blind people continued to be the primary developers of computers past the initial stages, advances in tactile interfaces would probably have replaced the advances in graphics. A tactile screen, like the one described above, would be a mechanical marvel, but wouldn't require much processing power to run; certainly nothing like playing a video. So the push for more and more powerful processors wouldn't have been as great. The tactile screen might be able to produce static images, by pushing pins up to form the outline of a shape, but probably most entertainment on computers would be in audio form. The podcast boom would have come much sooner, probably replacing the YouTube boom.
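To make the static-image idea concrete, here is a tiny Python sketch (my own illustration; no real pin-grid hardware or driver is implied) that turns a small bitmap into per-pin raise/lower commands:

```python
Bitmap = list[list[int]]   # 1 = raised pin, 0 = lowered pin

def bitmap_to_pin_commands(image: Bitmap) -> list[tuple[int, int, str]]:
    """Return a (row, column, 'up'|'down') command for every pin in the grid."""
    return [
        (r, c, "up" if cell else "down")
        for r, row in enumerate(image)
        for c, cell in enumerate(row)
    ]

if __name__ == "__main__":
    # A 5x5 outline of a square -- the kind of simple static shape described above.
    square = [
        [1, 1, 1, 1, 1],
        [1, 0, 0, 0, 1],
        [1, 0, 0, 0, 1],
        [1, 0, 0, 0, 1],
        [1, 1, 1, 1, 1],
    ]
    commands = bitmap_to_pin_commands(square)
    print(sum(1 for _, _, state in commands if state == "up"), "pins raised")  # 16
```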



I hope that helps!






– IAntoniazzi (answer score 4, answered 7 hours ago)

  • I love it! That's part of what I was curious about. So we may not have had a Moore's-law style of increase, since processing power wouldn't be the biggest limiter?
    – April, 5 hours ago
  • I believe that would be true, though I'd be open to arguments to the contrary. With a tactile interface, you would be limited by the human ability to process audio and haptic information. So once you had the ability to run the tactile screen and produce crisp sound, the push to improve processing power would not be commercially urgent. Only in the last few years, when the need to process large data sets has become critical, would there be a real need for powerful, commercially available processors.
    – IAntoniazzi, 4 hours ago

I think a good technology to consider for comparison is the telegraph. The telegraph also began as a technology for processing bits of information that, while accessible (it relied on the sound and touch of tapping), was cumbersome to use, because it required the user to learn a specialized code both to input and to interpret messages. So a specialized profession developed around the telegraph, which gave way once an interface easier for the layman (the telephone) was developed. Likewise, if punch-card computers are adopted a good century before cathode-ray tubes are sophisticated enough to create purely visual displays, you need to think about how you would stop CRTs from overwhelming punch cards and Refreshable Braille Displays.



Keeping telegraphs in mind, one interesting possibility is that your early punch-card computers could interface directly with telegraph lines. The French were already pioneers of long-distance communication: under Napoleon, signal towers were built connecting Paris to the frontiers of the country. What if a series of punch cards at a central computer in Paris could be sent to a punch-card writer in Marseilles almost instantly? You could have a sort of internet under Napoleon III!
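To show how little machinery that would actually take, here is a minimal Python sketch (my own illustration; the only assumption is the standard 12-row by 80-column card layout) of serializing a punched card into on/off pulses for a telegraph-style line and reassembling it at the far end:

```python
CARD_COLUMNS = 80
CARD_ROWS = 12   # rows 12, 11, 0-9 on a standard punched card

def card_to_pulses(card: list[int]) -> str:
    """Serialize a card (one 12-bit punch pattern per column) into '0'/'1' pulses."""
    assert len(card) == CARD_COLUMNS
    return "".join(format(column & 0xFFF, "012b") for column in card)

def pulses_to_card(pulses: str) -> list[int]:
    """Reassemble the pulse stream into 80 column patterns at the receiving punch."""
    assert len(pulses) == CARD_COLUMNS * CARD_ROWS
    return [int(pulses[i:i + CARD_ROWS], 2) for i in range(0, len(pulses), CARD_ROWS)]

if __name__ == "__main__":
    card = [0] * CARD_COLUMNS
    card[0] = 0b100000000001        # an arbitrary two-hole pattern in column 1
    assert pulses_to_card(card_to_pulses(card)) == card
    print(len(card_to_pulses(card)), "pulses per card")   # 960
```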



Ultimately, though, I think tactile interfaces are going to have a hard time catching on very widely, even with these boosts. The best bet is to try to stimulate a jump to more audio displays. This is where a lot of technology is trying to move now: natural-language interfaces, like Siri or Alexa. Maybe if a punch-card internet develops, you'd still have specialized data-entry clerks for input, but the displays would instead become temporary phonographs?



Honestly, there are a lot of repercussions that could come from this, but good luck exploring! Some other resources to look at are 'Jacquard's Web' by James Essinger, a non-fiction book on the development of the Jacquard loom and its significance, and 'The Difference Engine' by William Gibson and Bruce Sterling, which not only helped popularize the steampunk genre but also uses Babbage-style central computers as the major point of departure for its world.






– TzeraFNX (answer score 2, answered 6 hours ago)

  • I like this -- I was sort of picturing a Napoleon + Discworld Clacks system. I do think that since we're a while before movies and TV, but close to the telephone/radio age, we could end up with Siri/Alexa before 8-bit graphics! (I read The Difference Engine ages ago, and a bunch of Neal Stephenson, but I barely remember it now. I'm not too into steampunk as a genre, due to the way it seems to skip over colonization issues, but I will look for the nonfiction recommendation!)
    – April, 5 hours ago