As I mentioned earlier, we now have a reliable device to emboss Braille.
BrailleRAP can emboss any sheet of material that fits A4 size (210 x 297 mm).
BrailleRAP XL can emboss any sheet of material that fits A3 size (297 x 420 mm).
As for materials, we have successfully tested paper, plastic, stickers, thin metal, postcards... In short, you can try anything that fits in the device.
Having a device without software to drive it is like having nothing. BrailleRAP understands GCODE, but you don't want to write GCODE by hand to emboss Braille characters.
AccessBrailleRAP
AccessBrailleRAP is a specialized piece of software that translates text into Braille. It can open existing text files and, thanks to pandoc, you can also use OpenOffice documents and any file format readable by pandoc. The text is translated into Braille and paginated, so you can choose the pages you want to print. AccessBrailleRAP is NVDA compatible and usable by unsighted people.
DesktopBrailleRAP
DesktopBrailleRAP is a piece of software for composing page layouts with Braille and vector graphics. You can use it to annotate in Braille a geographic map, a biology schematic, an architectural drawing...
The INRIA Humanlab has even tested DesktopBrailleRAP with post-processed ultrasound images.
So what's next? Of course, improving the existing software and devices: easier to build, more features, and more accessible. Improving the documentation is also on the list; makers from all over the world have successfully replicated the device, so the build documentation is not so bad, but we can still improve it.
What we have done in recent weeks is document workshops where BrailleRAP is not the goal but the subject. While writing the software for the pattern features in DesktopBrailleRAP, I was wondering what makes a good pattern to represent colors. We had already made some tests with patterns, and we already knew that filling figures with tangible patterns is a good way to produce richer tangible documents. But we wanted a better understanding of how unsighted people perceive tangible documents, and whether patterns really enhance that perception. So with My Human Kit we designed a workshop and ran it with a local non-profit organisation of unsighted people. The workshop plan is available here https://github.com/braillerap/DesktopBrailleRAP/wiki/Pattern-filling-test-%E2%80%90-the-method and the results are available here https://github.com/braillerap/DesktopBrailleRAP/wiki/pattern-test-workshop-2024-11-20. During the workshop, we learned a lot about building tangible documents for unsighted people: what is usable, what is comfortable, and of course what is not usable. Documenting a project is usual; documenting the way the device is used is a good way to show what it can be used for.
Since the end of 2023 we have been working hard to update and improve the project. The Hackaday Prize shone some light on the project, and it also brought new development opportunities.
New Software
By the end of 2023, there was only AccessBrailleRAP, a piece of software where you can enter some text, translate it into Braille, and emboss it on the BrailleRAP. Braille translation is provided by liblouis, which supports about 200 Braille standards, translating languages from English to Chinese by way of Italian, Dutch, and more. The most widely used languages around the world can now be translated into Braille in AccessBrailleRAP. Thanks to the Codeberg platform, the GUI of the software is also translated into several languages (Arabic, Chinese, Ukrainian, German...).
The main piece of software we released in 2024 is DesktopBrailleRAP. Just imagine a piece of software where you can place text labels and SVG vector graphics on a page. Text labels are translated into Braille, and vector graphics are processed to emboss a series of dots along all the edges of each SVG element. This tool gives you the opportunity to build tangible documents where Braille is combined with tactile graphics. So you can build geographic maps, or pedagogic documents about mathematics, physics, biology...
We are still testing this new graphics feature, but it is very promising. We are currently working on filling figures with dot patterns, to make it possible to represent different colors in a drawing.
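To make the dots-along-the-edges idea concrete, here is a minimal sketch of the core operation: walking a polyline (an SVG path, once flattened) and emitting a dot every few millimeters. This is only an illustration of the technique, not DesktopBrailleRAP's actual code, and the 2.5 mm spacing is just an assumed value.

```python
import math

def resample_polyline(points, spacing):
    """Walk along a polyline and emit a dot every `spacing` units,
    which is roughly what is needed to emboss an SVG outline as dots."""
    dots = [points[0]]
    carry = 0.0  # distance already walked since the last dot
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = spacing - carry
        while d <= seg:
            t = d / seg
            dots.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing
        carry = seg - (d - spacing)
    return dots

# A 10 mm square outline, one dot every 2.5 mm: the start dot plus 16
# evenly spaced dots, the last one landing back on the start corner.
square = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
print(len(resample_polyline(square, 2.5)))  # → 17
```

A real implementation would also have to flatten Bézier curves into polylines first, and deduplicate the closing dot of a closed path.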
BrailleRAP XL
After some tests with DesktopBrailleRAP, it was obvious that an A4 paper sheet is too small for some usage scenarios. So we started thinking about a bigger design. Basically, we took the A4 design, enlarged the frame, added some paper rollers, and strengthened some parts of the frame. And there we are: we now have two designs. The smaller one takes any material up to 210 x 297 mm; the BrailleRAP XL takes any material up to 297 x 420 mm.
Each one has its pros and cons. The historic A4 design is cheap, lightweight, and easily transportable in a reasonably sized backpack or case. The XL design of course offers more capacity, but it is also a little more expensive (around $450) and 50% heavier (around 7 kg). With a frame size of 470 x 270 x 150 mm, I can't imagine putting it in a backpack for everyday use on the move.
Now that we have a working solution to turn a BrailleRAP into a WIFI access point and expose its features through a mobile web app, it's time to try a more capable Braille translator: liblouis.
Liblouis is an open source Braille translation library: https://liblouis.io/. It is a nice library that can translate text into Braille according to 200 Braille standards, for many languages all over the world.
We had already done this work in AccessBrailleRAP, compiling all the liblouis code and data files into a WebAssembly to build our React app with accessibility features (i.e. aria- HTML attributes). But in AccessBrailleRAP the liblouis WebAssembly weighs about 12 MB; there is no way to fit that into a 4 MB ESP32.
The great idea at the base of liblouis is that the library itself is "just" an interpreter for grammar files defining how to translate a language into Braille dot patterns.
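To illustrate that idea, a translation "table" can be nothing more than data, with a tiny engine interpreting it. This toy sketch has nothing to do with the real liblouis table format; it only shows the principle of keeping the language rules out of the code:

```python
# Toy illustration of a table-driven Braille translator: the "table" is
# plain data, and the engine just interprets it (this is NOT the real
# liblouis table format, only the idea behind it).
TABLE_EN = {
    'a': '⠁', 'b': '⠃', 'c': '⠉', 'd': '⠙', 'e': '⠑',
    ' ': '⠀',
}

def translate(text, table):
    # Unknown characters fall back to the blank Braille cell.
    return ''.join(table.get(ch, '⠀') for ch in text.lower())

print(translate("ab c", TABLE_EN))  # → ⠁⠃⠀⠉
```

Supporting a new language then means shipping a new data file, not new code, which is exactly why stripping unused tables shrinks the build so much.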
So we tried selecting a subset of languages (English, French, German, Spanish, Italian, Portuguese) and building a WebAssembly with only the files needed for those languages.
After rebuilding the WebAssembly with the selected files, the verdict is: 493 KB! This is much better than 12 MB.
We then upgraded our previous ESP32 firmware with the new Braille translator, just adding a combo box to select the Braille standard you want to use. And everything works fine: we now have up-to-date Braille translation available, allowing the BrailleRAP embossing features to be used from a simple mobile phone.
Now that we have enabled a second UART on the MKS controller board of a BrailleRAP, it's time to build the real thing. With an ESP32 wired to the controller board, can we enable a WIFI access point and serve a mobile web application to emboss some Braille on a BrailleRAP?
Enabling a WIFI access point with an ESP32 is not a big deal; the internet is full of examples. Starting a web server that serves a React web application seems feasible. So we started a project with VS Code and the Arduino ecosystem, and a small React application with an input field and a print button.
Starting a WIFI access point is just a line of code:
WiFi.softAP(ssid, password);
And starting a web server from files stored in SPIFFS seems easy.
But it didn't work: SPIFFS in the Arduino environment limits filenames to 32 characters, which is not enough for a quickly packaged React app. As we often do with lwIP in microcontroller environments, we chose to embed all the files in C data structures. And it failed again: everything seemed to work with small files, but with a JS file of a few kilobytes something went wrong and hung. So we fell back to ESP-IDF, the original Espressif toolchain for the ESP32.
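Embedding files in C data structures is the same trick the classic `xxd -i` command performs. A minimal sketch of such a generator (a hypothetical helper, not the script we actually used) could look like this:

```python
def file_to_c_array(name, data):
    """Render a byte string as a C array definition, the same idea as
    `xxd -i`, so a web asset can be compiled into the firmware."""
    body = ',\n  '.join(
        ', '.join(f'0x{b:02x}' for b in data[i:i + 12])
        for i in range(0, len(data), 12)
    )
    return (f'const unsigned char {name}[] = {{\n  {body}\n}};\n'
            f'const unsigned int {name}_len = {len(data)};\n')

# Turn a (tiny) web asset into C source the firmware can serve directly.
print(file_to_c_array('index_html', b'<html>ok</html>'))
```

The generated `.c` file is then compiled into the firmware, and the web server serves the array straight from flash, with no filesystem involved.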
And after a few tests, we had a working WIFI access point with a React-based captive web portal.
To quickly translate the input text into Braille, we just copy-pasted our previous tbfr2007 translator written in JavaScript. This is not a "global" solution, as there are nearly as many Braille standards as there are countries in the world, but it is good enough for a first try.
Wiring
As we planned to use UART2 of the ESP32 to talk to the controller board, we selected the standard IO16 and IO17 pins as TX and RX on the ESP side. So we just need to wire TX from the ESP to RX on the MKS, RX from the ESP to TX on the MKS, and link the GND of the ESP to the GND of the controller board.
Sending the GCODE
There are several solutions for sending GCODE to a board running Marlin firmware; one of the simplest is just to send the GCODE commands without comments and wait for an "ok" or "error" answer from the board. The more important problem is that sending the complete GCODE file from the React app (which runs on the mobile phone) to the ESP does not seem like a good idea: the ESP32 is a comfortable MCU with hundreds of kilobytes of RAM available, but GCODE files can easily take a few megabytes, even for a single Braille page.
As we wanted to move fast, we first tried sending GCODE commands one at a time, embedded in JSON POST requests to the ESP. It partially worked, but with poor performance.
So we decided to go with websockets: when you click the print button, the React application opens a websocket to the web server on the ESP, then sends GCODE commands one at a time. The ESP sends each command to the UART, waits for the controller's answer, and sends the command status back to the React app via the websocket. We were able to print a complete Braille page, but with modest performance compared to what we achieve with AccessBrailleRAP connected to the controller over USB.
Final solution for the test
To improve performance, we implemented a simple protocol with a FIFO queue on the ESP32 side. The React app sends GCODE commands one at a time, and the ESP answers immediately whether there is space for more commands in the FIFO. So the React app sends the GCODE as fast as it can; when the buffer is full, it waits for some space before sending the next command. On the ESP side, a FreeRTOS task waits for GCODE commands in the FIFO, sends each one to the UART, waits for the controller's answer, and sends a status back to the React app. At the end of the print, we simply close the websocket. And it works just fine!
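To make the flow control easier to picture, here is a small Python simulation of the protocol described above. The real code is split between the React app and the ESP32 firmware; the class names and the FIFO depth here are assumptions for illustration only.

```python
from collections import deque

FIFO_SIZE = 8  # assumed queue depth on the ESP side

class EspSide:
    """Stands in for the ESP32: a bounded FIFO fed over the websocket,
    drained by a task that talks to the controller over the UART."""
    def __init__(self):
        self.fifo = deque()
        self.sent_to_uart = []

    def receive(self, cmd):
        # Answer immediately: accepted if there is room, else "busy".
        if len(self.fifo) >= FIFO_SIZE:
            return 'busy'
        self.fifo.append(cmd)
        return 'ok'

    def uart_task_step(self):
        # One iteration of the FreeRTOS task: pop a command, send it
        # to the controller and (in reality) wait for its "ok".
        if self.fifo:
            self.sent_to_uart.append(self.fifo.popleft())

def stream(gcode, esp):
    """App side: push as fast as possible, retry when the FIFO is full."""
    for cmd in gcode:
        while esp.receive(cmd) == 'busy':
            esp.uart_task_step()  # simulate time passing until space frees up
    while esp.fifo:               # let the task drain the queue at the end
        esp.uart_task_step()

esp = EspSide()
cmds = [f'G0 Y{i}' for i in range(20)]
stream(cmds, esp)
assert esp.sent_to_uart == cmds  # every command reaches the UART, in order
```

The point of the window is that the phone never waits for the slow UART round-trip unless the buffer is actually full, which is what recovered the USB-level throughput.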
On the path to WIFI connectivity for BrailleRAP, we needed to upgrade the BrailleRAP firmware to Marlin 2, as the standard BrailleRAP firmware was still based on Marlin 1.
We call the firmware we use in BrailleRAP MarlinBraille. Basically, it is a Marlin firmware with the right configuration for homing and steps/mm, and it uses the bed power connector to control the electromagnet. We modified the G28 command to set the Y-axis zero at the top of the paper sheet, which is just a slight logic modification of the homing function:
- If the Y endstop is triggered, there is already a sheet of paper in the device, so we do a standard homing procedure.
- If the Y endstop is not triggered, we move the paper sheet forward until the endstop triggers, then do a standard homing procedure.
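The two cases above boil down to a simple decision, sketched here in Python-flavored pseudocode (not actual Marlin code; the callback names are hypothetical):

```python
def home_y(endstop_triggered, feed_paper, standard_homing):
    """Sketch of the modified G28 behaviour: make sure a sheet is
    engaged before running the normal homing move."""
    if not endstop_triggered():
        # No paper detected yet: pull the sheet in until the Y endstop fires.
        while not endstop_triggered():
            feed_paper()
    standard_homing()

# Tiny simulation: the endstop triggers after 3 feed steps.
state = {'steps': 0, 'homed': False}
home_y(lambda: state['steps'] >= 3,
       lambda: state.update(steps=state['steps'] + 1),
       lambda: state.update(homed=True))
print(state)  # → {'steps': 3, 'homed': True}
```

In the firmware itself this lives inside the G28 handler, where the endstop read and stepper moves are Marlin internals.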
Porting the MarlinBraille features to Marlin 2 was not a big deal: just a matter of porting the motor driver configuration and specializing the homing logic. Building Marlin 2 is another story if you want to build it for boards based on several MCUs. And that is what we have in mind: building a firmware for the MKS Gen 1.4 and MKS Gen L V2.1, which are based on the ATmega2560 MCU, and a firmware for the MKS TinyBee, an ESP32-based control board.
The Marlin 2 build system is based on buildroot, a development tool originally created to compile the Linux kernel for various embedded hardware. You can still use the Arduino IDE or VS Code if you just need one configuration, but if you want to build for several MCUs, you need an automated build tool, based on make or cmake, plus the toolchain for every MCU you target. This kind of build environment is easily set up in Docker. Docker is a tool for building and running containers (lightweight virtual machines); you can use it to set up software configurations such as the web or database servers available on the internet. You can also use Docker to build a temporary "machine" just to install a complex software toolchain, compile what you need, and then destroy it. With this kind of procedure, you avoid "polluting" your own system with complex software installations.
For those who are interested, you will find all the source code on GitHub:
Now that we have a firmware based on Marlin 2, we can activate the second UART in Configuration.h:
/**
* Select a secondary serial port on the board to use for communication with the host.
* Currently Ethernet (-2) is only supported on Teensy 4.1 boards.
* :[-2, -1, 0, 1, 2, 3, 4, 5, 6, 7]
 */
#define SERIAL_PORT_2 2
#define BAUDRATE_2 250000 // :[2400, 9600, 19200, 38400, 57600, 115200, 250000, 500000, 1000000] Enable to override BAUDRATE
According to the documentation, the UART RX and TX pins are D16 and D17 on the EXP1 connector of the MKS Gen L V2.1 board. GND pins are also available on J25.
So now we can test with a USB-to-serial converter, wiring TX to D17, RX to D16, and GND to GND.
And it works perfectly! We are now able to connect to the board with Pronterface through either the USB port or the USB-to-serial adapter.
Since we started building embossers, some users have been asking for a particular feature: the ability to emboss Braille from a smartphone, especially Android, since it is the most widespread platform in Africa. This summer, we worked on the WIFI feature, with some failures and some promising successes.
Desirable specifications
- The system will provide a way to connect an Android phone to a BrailleRAP over WIFI or Bluetooth.
- The Android application will let the user input some text, translate it into a Braille standard, and send it to the BrailleRAP.
- Obviously, as unsighted people use Android smartphones, the application will use the Android accessibility features to be usable by unsighted people, like the AccessBrailleRAP software.
- The Android application must be able to work without an internet connection.
- The Braille translation must be up to date and adapted to different Braille standards, as AccessBrailleRAP does with the liblouis library.
- The Android application must be multilingual.
- Depending on the technical solution, it would be nice to provide an "upgrade kit" for existing BrailleRAPs, as there are already several BrailleRAPs in the field.
- A WIFI BrailleRAP should still be usable over a USB connection with a laptop. A mobile phone is a nice option, but a laptop allows more advanced usage.
- The solution must be cheap: a complete BrailleRAP costs about $250, so it wouldn't be reasonable to offer a $100 WIFI option.
- And last but not least, the whole system must be open source licensed, like everything in the BrailleRAP ecosystem.
Available technical solutions
The first idea that comes to mind is using a Raspberry Pi. In the BrailleRAP team, we use our 3D printers with OctoPrint, a nice Raspberry Pi distribution that lets you drive your 3D printer over WIFI with a user-friendly web interface. OctoPrint is specialized for 3D printing, and you can use it with a BrailleRAP, but it is not the user experience we want. With some software development we could build a solution with the Raspberry Pi as an access point, an option to join an existing WIFI network, and a web application like AccessBrailleRAP. A Raspberry Pi Zero W costs around $20 and can fit in the BrailleRAP frame. It would be an elegant solution, but we have the feeling that a Raspberry Pi would be a little overkill, and it may be hard to maintain, as our current users are not comfortable with Linux systems.
The second idea is to use an ESP32-based board. Currently we use the MKS Base or MKS Gen L, but MKS also offers the MKS TinyBee, a 3D printer board based on an ESP32 MCU with WIFI connectivity. But what about upgrading existing devices? You could swap the board for an MKS TinyBee, but you would end up with an unused board on your hands. A more elegant solution would be to add an ESP32 inside the BrailleRAP frame, providing WIFI connectivity and serving a mobile web application. For communication with the MKS Gen L board, the Marlin 2 firmware provides a way to enable a second UART for serial communication. You can still use the board over USB, and you get a second channel to send GCODE commands to the controller board.
As there is still a global shortage of Raspberry Pi Zero W boards, and I have some ESP32 DevKits and an MKS TinyBee on my desk, I chose to give the second solution a try.
About a week ago, we made some tests integrating pandoc into our Braille translation software, AccessBrailleRAP.
Once you have a good translation algorithm, the major issue in translating documents into Braille is that Braille characters are fixed-size. Basically, Braille is a tactile alphabet where each letter is a matrix of 6 dots (or 8 dots), with each combination in the matrix corresponding to a letter.
When you read a Braille letter, all 6 dots of the matrix must fit under one finger; this is why Braille is fixed-size and normalized.
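Incidentally, the 6-dot cell maps neatly onto Unicode: the Braille Patterns block starts at U+2800, and dot n of the cell sets bit n-1 of the code point. A small helper makes that concrete:

```python
def braille_cell(dots):
    """Build a Unicode Braille character from a set of dot numbers (1-6).
    Dot n sets bit n-1 above the U+2800 base of the Braille block."""
    code = 0x2800
    for d in dots:
        code |= 1 << (d - 1)
    return chr(code)

print(braille_cell({1}))        # dot 1        → ⠁ (the letter 'a')
print(braille_cell({1, 2}))     # dots 1 and 2 → ⠃ (the letter 'b')
print(braille_cell({2, 4, 6}))  # an arbitrary dot pattern
```

This is why a Braille translator can output ordinary Unicode strings, which screen readers and embossing software both understand.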
Therefore, you can represent neither big Braille characters nor small ones. You can represent capital letters and numbers, and there is an "emphasis" encoding available, but there is no way to print italic, underline, bold, or any of our favorite typographic effects. So, to translate a Word, OpenOffice, or even a Markdown or HTML document into Braille, you need a tool to extract the plain text, trying to preserve the layout with space characters. This is where pandoc is useful: pandoc is an open source tool for converting documents between formats. You can convert an OpenOffice .odt to PDF or a Markdown document to .odt, and you can convert all these beautiful file formats to plain text.
As we have a Python backend in AccessBrailleRAP, integrating pandoc was not an issue: just add an import button in the GUI and write the ten lines of code to get a filename, hand it to pandoc, and get the plain text back. But you need pandoc installed and available in the PATH for the AccessBrailleRAP software. As that operation is a little tricky for most of our BrailleRAP users, we decided to go a step further and build an installation script that installs everything you need to use a BrailleRAP.
At this time, if you want to use AccessBrailleRAP, you need to install:
- The virtual COM port drivers to communicate with the BrailleRAP MKS board
- pandoc
- Chrome
As an installation tool, we chose NSIS, an open source tool for building installers. NSIS lets you build an installer just by writing a little script defining which files you want to include and where to put them on the PC.
Once the installation script is defined (we made intensive use of the tutorials provided by NSIS), you just run the NSIS compiler and you get an installer: AccessBrailleRAPSetup.exe
You can select whether you want the drivers or not.
And now you have everything installed to use a BrailleRAP, plus AccessBrailleRAP in the Windows Start menu.
Since we tried MusicXML files a few days ago, I was wondering whether a pandoc module is available for Python.
Pandoc is a well-known open source command line tool for converting file formats; you can use it to convert HTML to PDF, Markdown to HTML, and so on.
The main issue with word processor formats is that they contain many features that are simply not available in Braille. Different fonts and font sizes do not exist in Braille, because a Braille character is a normalized fixed-size matrix of 6 or 8 dots, depending on the Braille standard.
So to convert an OpenOffice .odt file or a Word .doc, you need a tool to extract the plain text from these files. This is where pandoc can be useful: pandoc can extract plain text from many file formats.
After some internet searching, I found pypandoc, a Python module that bridges pandoc with Python software.
So I started a little test with AccessBrailleRAP. Just as we had already done for MusicXML, I added a backend Python function that asks the user for a file, converts it to plain text with pandoc, and returns the result to the JavaScript frontend.
And I just gave it a try with a little OpenOffice test file: a line of text with some formatting, and a little table.
Starting AccessBrailleRAP, I tested the new import button
and selected our OpenOffice .odt test file.
Not bad: we get all the text, and the table is preserved.
Converting that to Braille:
Gotcha, we have a proof of concept! We definitely need to build an installer that includes all the software needed to work with AccessBrailleRAP (drivers, pandoc, ...), but this is a promising feature, allowing anybody to open a word processor file, convert it into Braille, and emboss it. You don't even need to know anything about Braille!
Since we started the BrailleRAP project a few years ago, we have heard some people asking about Braille music scores.
As we were focused on literary transcription, we had put this feature aside, just noting that Braille music notation is yet another standard in the Braille world.
I'm not a musician, but sometimes I love making music hacks with biosensors, MIDI, and software synths on a Raspberry Pi. While hacking around with open source music software, I discovered MuseScore (https://musescore.org/), a wonderful and impressive open source tool for editing and displaying music scores, which also handles many music file formats.
Last night, looking for a Python module for a customer, I found music21, "a toolkit for computer-aided musicology" (http://web.mit.edu/music21/#), and this article by Young Choi about Braille and music21 (https://www.linkedin.com/pulse/convert-music-xml-braille-music21-young-choi/), with just a few lines of Python to translate a MusicXML file into Braille. Starting from Young Choi's example, I created a Python venv and wrote a little script:
import music21
import sys

print(sys.argv)
c = music21.converter.parse(sys.argv[1])
c.show('braille')
#c.show('png')
bu = music21.braille.translate.objectToBraille(c, maxLineLength=28, showHeading=True)
data = bu.splitlines()
for l in data:
    print("{0} |{1}|".format(len(l), l))
Grabbing some MusicXML files here and there, I tried the script and got this:
A pretty well-formatted Unicode string with a Braille music score! Wow!
If you have read about BrailleRAP, you know that AccessBrailleRAP, our Braille translation software, is based on the Eel Python module. Eel lets you package an HTML/JavaScript frontend with a Python backend in a bundled application for Windows.
So I started a new app from AccessBrailleRAP, calling it MusicBrailleRAP: add a page and a button in the React.js frontend, and drop some Python lines in the backend. With a little Python function, we can open a file selection dialog, convert the file to Braille character strings with music21, and return the result as JSON to the React frontend.
@eel.expose
def get_file_dialog():
    root = Tk()
    fname = filedialog.askopenfilename(initialdir="./", title="Select file",
                                       filetypes=(("musicXML", "*.mxl"), ("all files", "*.*")))
    print(fname)
    root.destroy()
    linel = int(app_options['nbcol']) - 1
    c = None
    if zipfile.is_zipfile(fname):
        with zipfile.ZipFile(fname, mode='r') as archive:
            with archive.open('score.xml') as score:
                print("trying to open a msz")
                data = score.read()
                c = music21.converter.parseData(data)
    else:
        print("trying to open a xml")
        c = music21.converter.parse(fname)
    bu = music21.braille.translate.objectToBraille(c, maxLineLength=linel, showHeading=True)
    data = str(bu)
    js = json.dumps(data)
    return js
After a few hours of tests and updates to the frontend and the Braille paginator, I can now use MuseScore to open a music score.
I use the export feature to export the score in MusicXML format,
open MusicBrailleRAP,
use the "Open a MusicXML file" button, and select the previously exported .mxl file.
And I got this:
"Et voilà": just load a paper sheet in my BrailleRAP, select a page, and press the Print button. Two minutes later, I had a music score embossed on paper, ready to start some tests with unsighted musicians!
Of course, there is still a lot of work to do:
- Check that MusicBrailleRAP is still NVDA compatible.
- Work on the UX with unsighted users.
- Check that the Braille music scores are comfortable to use.
- Add some error checking and input validation to MusicBrailleRAP.
- Test, share, and experiment with users.
But now we have a start for a new feature. And it's exciting: what can we do with Braille music scores in workshops, what can we experiment with in public fablabs... Just wait and see.