An idea,

Janina Sajka janina at rednote.net
Wed Jul 27 11:33:41 EDT 2005


Good point, Scott. Far too difficult, imho.

All I can answer by way of "Why?" is to point out that somebody had to
decide. And, in this instance, that somebody got insufficient input from
the greater "us." Clearly, they blew it. Output like:

Layer 0
Layer 1
Layer 2

is so o o o unhelpful, when it could easily be very helpful.
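For anyone wondering how "Layer 0 / Layer 1 / Layer 2" output comes about, here is a toy Python sketch of the underlying problem. This is emphatically not Gnopernicus or ATK code; every class and name here is invented for illustration. The point is just that when a widget carries no developer-supplied accessible name, the screen reader can only fall back on the widget's role plus an index:

```python
# Toy model of why a widget with no accessible name gets spoken
# unhelpfully. NOT Gnopernicus/ATK code -- all names are illustrative.
# In real GTK2 the fix is for the application to supply the accessible
# name on the widget's accessible object.

class Widget:
    """A widget as the accessibility layer might see it: a role plus an
    optional developer-supplied accessible name."""
    def __init__(self, role, accessible_name=None):
        self.role = role
        self.accessible_name = accessible_name

def speak(widget, index):
    # With no accessible name, all a screen reader can fall back on is
    # the role and an index -- hence "Layer 0", "Layer 1", "Layer 2".
    if widget.accessible_name:
        return widget.accessible_name
    return "%s %d" % (widget.role, index)

# An app whose developers never set accessible names:
unnamed = [Widget("Layer"), Widget("Layer"), Widget("Layer")]
print([speak(w, i) for i, w in enumerate(unnamed)])
# -> ['Layer 0', 'Layer 1', 'Layer 2']

# The kind of one-line-per-widget fix a contributor can patch in:
named = [Widget("Layer", "Background"), Widget("Layer", "Selection mask"),
         Widget("Layer", "Text overlay")]
print([speak(w, i) for i, w in enumerate(named)])
# -> ['Background', 'Selection mask', 'Text overlay']
```

The "fix" is exactly the kind of patch described further down for Gnome Volume Control: supply the missing name properties, and the same widgets become fully speakable.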

Scott Berry writes:
> This really doesn't pertain to how Gnopernicus works, but one 
> question I really would like to throw out here is: why are 
> the keystrokes so difficult?  I know you can change key bindings in 
> Gnopernicus, but it looked like only a guy with 20 years of experience 
> could do it.  Maybe this has changed in recent versions, but I 
> remember when I started using Gnopernicus -- wow!  Difficult to learn.
> 
> 
> 
> At 08:05 AM 7/27/2005, you wrote:
> 
> >Hi, Lorenzo:
> >
> >Others have responded with reference to what an X server does, and
> >doesn't do. I want to respond to two other particular points from your
> >message.
> >
> >Lorenzo Taylor writes:
> >> -----BEGIN PGP SIGNED MESSAGE-----
> >> Hash: SHA1
> >>
> >>  ... Gnopernicus, for example, is using libraries that
> >> rely on certain information sent by the underlying application libraries.
> >> Unfortunately, this implementation causes only some apps to speak 
> >while others
> >> which use the same widgets but whose libraries don't send messages to the
> >> accessibility system will not speak.
> >
> >This is only partially correct. Any applications using those "same
> >widgets," as you put it, will speak. There are no exceptions.
> >
> >What causes them to not speak is that the properties required to make
> >them speak have not been supplied. So, Gnopernicus is getting an empty
> >string to render, which I suppose it dutifully renders as silence.
> >
> >Fortunately, these are open source applications and we don't need an
> >advocacy campaign to resolve these kinds of problems. A solid example of
> >this at work is the Gnome Volume Control. It was written with gtk2, but
> >the developers did not supply all the relevant property data. So, a
> >blind programmer came along one weekend, fixed it, and submitted the
> >patch which has shipped with the rest of Gnome Volume Control ever
> >since.
> >
> >Now the next point ...
> >
> >>  But it occurs to me that X is simply a
> >> protocol by which client applications send messages to a server 
> >which renders
> >> the proper text, windows, buttons and other widgets on the 
> >screen.  I believe
> >> that a screen reader that is an extension to the X server itself, 
> >(like Speakup
> >> is a set of patches to the kernel) would be a far better 
> >solution, as it could
> >> capture everything sent to the server and correctly translate it 
> >into humanly
> >> understandable speech output without relying on "accessibility 
> >messages" being
> >> sent from the client apps.
> >
> >
> >As others have pointed out, there's nothing to be gained by speaking RGB
> >values at some particular X-Y mouse coordinate location. But, I'm sure
> >that's not what you really intend. If I interpret you correctly you're
> >suggesting some kind of mechanism whereby a widget of some kind can be
> >reliably identified and assigned values that the screen reader can
> >henceforth utter. This is the approach the Windows off-screen model (OSM)
> >has taken over the past decade, and it's what allows screen readers, like
> >JFW, to develop interfaces based on scripts. For instance: take widget
> >number 38,492, call it "volume slider," speak it before anything
> >else when it shows up on screen, and facilitate the method
> >that will allow the user to use the up and down arrows to change its value,
> >etc., etc.
> >
> >It is arguable, and has been cogently argued over the past 18 months,
> >that the failing of the original Desktop Accessibility Architecture
> >promoted by Sun and Gnome was that it did not provide such mechanisms. A great
> >part of the intent of the Orca screen reader proof of concept was to
> >provide exactly this kind of functionality. I believe this is now being
> >addressed, though I'm not aware that any code for newer Gnopernicus (or
> >post-Gnopernicus) readers has yet been released. However, I do fully expect that
> >Gnopernicus is not the last word in desktop screen readers.
> >
> >                                Janina
> >
> >_______________________________________________
> >Speakup mailing list
> >Speakup at braille.uwo.ca
> >http://speech.braille.uwo.ca/mailman/listinfo/speakup
> 

-- 

Janina Sajka				Phone: +1.202.494.7040
Partner, Capital Accessibility LLC	http://www.CapitalAccessibility.Com
Bringing the Owasys 22C screenless cell phone to the U.S. and Canada. Go to http://www.ScreenlessPhone.Com to learn more.

Chair, Accessibility Workgroup		Free Standards Group (FSG)
janina at freestandards.org		http://a11y.org



