An idea,
Kenny Hitt
kenny at hittsjunk.net
Wed Jul 27 00:33:03 EDT 2005
Hi.
At the Xserver level, you don't have any
information beyond "draw this pixel using this combination of RGB data"
and "this pixel should be drawn on this screen of this display."
The text info and control type live at the toolkit library level, not at
the Xserver level.
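To make that concrete, here is a rough Xlib sketch (mine, not code from
any real screen reader) of a client that opens a window and paints a
string. Everything the Xserver receives from it is of the form "create
this window", "map it", "draw these glyphs at this position with this
graphics context". Nothing in that traffic says "label" or "button".

/* Toy Xlib client: the server only ever sees window and glyph drawing
 * requests from this program, never any widget semantics.
 * Build roughly like: cc demo.c -o demo -lX11
 */
#include <X11/Xlib.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);      /* connect to the Xserver */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     10, 10, 300, 100, 1,
                                     BlackPixel(dpy, scr),
                                     WhitePixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    const char *msg = "just glyphs as far as the server knows";
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == Expose)
            XDrawString(dpy, win, DefaultGC(dpy, scr),
                        20, 50, msg, (int)strlen(msg));
        else if (ev.type == KeyPress)
            break;                          /* press any key to quit */
    }
    XCloseDisplay(dpy);
    return 0;
}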
The best comparison to MS Windows I've come up with is to think of the
Xserver as the drivers for video, keyboard, and mouse. Even the
implementation and management of windows on the screen is handled by an
app that just talks to the Xserver.
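A window manager is just another client like that. The skeleton of one
(again only a sketch I'm writing for illustration) asks the Xserver to
redirect map requests on the root window to it, and then does everything
through the same protocol every other app uses:

/* Skeleton "window manager": an ordinary client that claims
 * SubstructureRedirect on the root window. Only one client may do this
 * at a time, so this fails with BadAccess if a real window manager is
 * already running. Build roughly like: cc tinywm.c -o tinywm -lX11
 */
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;
    XSelectInput(dpy, DefaultRootWindow(dpy),
                 SubstructureRedirectMask | SubstructureNotifyMask);
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == MapRequest)
            /* "manage" the new window by simply letting it appear */
            XMapWindow(dpy, ev.xmaprequest.window);
    }
}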
If you want to see this, create a .xinitrc file that starts no apps, so
only the Xserver comes up. You will have a screen with a mouse pointer.
You can move the pointer, but that will be it. Since no window manager or
app will be running, all you can do is kill off the Xserver, change the
screen resolution, or move a useless mouse pointer around the screen.
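The .xinitrc for that experiment can be as small as this (a sketch; the
session ends when the script exits, so the sleep is only there to hold
the bare Xserver open for ten minutes):

# ~/.xinitrc -- start no window manager and no apps
exec sleep 600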
BTW, the words "screen" and "display" have different and specific meanings
in X. My explanation above mixes the terms to try to get my point across.
I've read some X HOWTOs that do a better job of explaining it, but I
think you get the idea. Since current distros set up X for you
automatically, you don't need to follow the instructions in an X HOWTO,
but they are worth reading for educational purposes.
To get the best results with Gnome, you need an app that uses gtk2, a
screen reader that understands the info sent by the gtk2 accessibility
framework (AT-SPI), and a window manager that also sends its window
events to AT-SPI.
For KDE accessibility, substitute KDE for Gnome and qt4 for gtk2 in the
previous paragraph.
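Either way, the screen reader end of the pipeline is basically a process
that registers for events on the accessibility bus and speaks what comes
back. Here is a rough sketch of that shape using the C libatspi API from
at-spi2, which is a newer implementation than the AT-SPI in use when this
was written, so take the exact calls as an assumption rather than as what
Gnopernicus actually does:

/* Sketch of a screen reader's event loop with libatspi (at-spi2).
 * It never scrapes the screen; it listens for events that the toolkit
 * (gtk2 and friends) and the window manager push into AT-SPI.
 * Build roughly like:
 *   cc reader.c -o reader $(pkg-config --cflags --libs atspi-2)
 */
#include <atspi/atspi.h>
#include <stdio.h>

static void on_event(AtspiEvent *event, void *user_data)
{
    gchar *name = atspi_accessible_get_name(event->source, NULL);
    /* a real screen reader would hand this off to a speech engine */
    printf("%s: %s\n", event->type, name ? name : "(unnamed)");
    g_free(name);
    /* (event ownership/cleanup details omitted in this sketch) */
}

int main(void)
{
    atspi_init();
    AtspiEventListener *listener =
        atspi_event_listener_new(on_event, NULL, NULL);
    /* window events come from the window manager side; focus changes
     * come from the toolkit's accessibility support */
    atspi_event_listener_register(listener, "window:activate", NULL);
    atspi_event_listener_register(listener,
                                  "object:state-changed:focused", NULL);
    atspi_event_main();   /* runs until atspi_exit() is called */
    return 0;
}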
Hope this helps.
Kenny
On Tue, Jul 26, 2005 at 11:41:21PM -0400, Lorenzo Taylor wrote:
> Here's another idea, maybe no one has thought of it yet, or maybe it is
> impossible to implement, but here it goes.
>
> It seems that the existing approaches for X screen readers should be taking a
> look at Speakup as a model. Gnopernicus, for example, is using libraries that
> rely on certain information sent by the underlying application libraries.
> Unfortunately, this implementation causes only some apps to speak while others
> which use the same widgets but whose libraries don't send messages to the
> accessibility system will not speak. But it occurs to me that X is simply a
> protocol by which client applications send messages to a server which renders
> the proper text, windows, buttons and other widgets on the screen. I believe
> that a screen reader that is an extension to the X server itself (like Speakup
> is a set of patches to the kernel) would be a far better solution, as it could
> capture everything sent to the server and correctly translate it into humanly
> understandable speech output without relying on "accessibility messages" being
> sent from the client apps.
>
> Any thoughts on this would be welcome.
>
> Lorenzo