Thursday 3 November 2011

MAME on your Mac right now

Seriously, all Mac MAME front-ends are dead projects. To get a proper MAME application for my Mac I had to bundle it myself. To spare you the same trouble, here is MakeMame4Mac. It's a set of files that downloads and builds everything needed to make a MAME application bundle that runs smoothly on your Mac. Just download the archive, uncompress it and run make. It is tailored for 64-bit processors (Core 2), so you may need to tweak the configuration if you have a different processor. The Makefile will download and build SDL and MAME, and wrap the whole package in a nice application bundle. You will end up with a Mame.app; just copy it into your /Applications directory and you're good.
Note this is just a simple bundle for MAME, not a front-end. To configure the location of your ROMs you will need to edit the mame.ini file yourself. This file is located at $HOME/Application/Mame/mame.ini. It's just the standard mame.ini file; check the MAME documentation to set it up. If the $HOME/Application/Mame directory is not there, just run Mame.app once and quit: this will create it. All the various nvram files and co. will also be created under $HOME/Application/Mame, so nothing gets scattered anywhere else.
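For example, pointing MAME at your ROM folder just means rewriting the rompath line (rompath is a standard mame.ini option; the helper function and paths below are mine, just a sketch):

```shell
# Sketch: rewrite the rompath entry of a mame.ini file.
# $HOME/Application/Mame/mame.ini is created on the first launch of Mame.app.
set_rompath() {
  ini="$1"; roms="$2"
  # Replace the whole "rompath ..." line with the new directory
  sed "s|^rompath .*|rompath ${roms}|" "$ini" > "$ini.tmp" && mv "$ini.tmp" "$ini"
}
```

Something like `set_rompath "$HOME/Application/Mame/mame.ini" "$HOME/roms"` would then do the trick.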

Project is available here.
A build for Mac OS X 10.7 (Core 2 processor) is available here.

Enjoy!

Thursday 27 October 2011

Make a Mac OS X application bundle for your Linux app

That's something pretty handy and not so well documented. I mean, you have plenty of examples on how to turn your Xcode project into a nice Mac OS X application, but when making a cross-platform application using good old Make, you don't really want to have two build systems on your hands. Plus you will need to somehow embed the dependency libraries that are not provided by Mac OS X. But rejoice: you can build a Mac OS X application bundle from scratch without too much of a hassle.

For this solution, I took heavy inspiration from the GTK+ application bundler. The principle is quite simple: you need to create an application bundle directory structure as described here. Place your executable in the Contents/MacOS directory, and all your libs in Contents/Resources. Now the Info.plist. This file is the entry point for running your application bundle. It can do a lot of things, but we will use the bare minimum here (define the startup executable and the icon):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist SYSTEM "file://localhost/System/Library/DTDs/PropertyList.dtd">
<plist version="0.9">
  <dict>
    <key>CFBundleName</key>
    <string>PouillotPouillot</string>


    <key>CFBundleDisplayName</key>
    <string>Pouillot Pouillot</string>


    <key>CFBundleIdentifier</key>
    <string>com.PouillotPouillot</string>


    <key>CFBundleVersion</key>
    <string>0.0.2</string>


    <key>CFBundlePackageType</key>
    <string>APPL</string>


    <key>CFBundleSignature</key>
    <string>puyo</string>


    <key>CFBundleExecutable</key>
    <string>launcher.sh</string>


    <key>CFBundleIconFile</key>
    <string>pouillotpouillot.icns</string>
  </dict>
</plist>


The icon must be in the Contents/Resources directory. You'll notice we don't directly use the binary to start the application, but a launcher script. This is the key element of this bundle: a neat trick that saves us the pain of creating a Framework bundle with all the libs required by our application. Let me show you this script:

#!/bin/sh
name="`basename $0`"
tmp="`pwd`/$0"
tmp=`dirname "$tmp"`
tmp=`dirname "$tmp"`
bundle=`dirname "$tmp"`
bundle_contents="$bundle"/Contents
bundle_res="$bundle_contents"/Resources
bundle_lib="$bundle_res"/lib
bundle_bin="$bundle_res"/bin
bundle_data="$bundle_res"/share
bundle_etc="$bundle_res"/etc


export DYLD_LIBRARY_PATH="$bundle_lib"


exec "$bundle_contents/MacOS/pouillotpouillot"

This is where the magic happens. We retrieve the bundle directory and its various sub-directories, then set the DYLD_LIBRARY_PATH variable so the libraries in the Resources directory load properly. This script does the bare minimum; I encourage you to check the launcher script of the GTK+ app bundler.
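Putting the pieces together, the whole skeleton can be assembled with a few commands. This is a minimal sketch, not a full bundler; the file names match the PouillotPouillot example above and are otherwise hypothetical:

```shell
# Sketch: build the bundle layout described above. Assumes Info.plist,
# launcher.sh, the binary, the .icns icon and the .dylib libs all sit in
# the current directory; names follow the PouillotPouillot example.
make_bundle() {
  app="$1.app"
  mkdir -p "$app/Contents/MacOS" "$app/Contents/Resources/lib"
  cp Info.plist "$app/Contents/"
  cp launcher.sh "$app/Contents/MacOS/"        # CFBundleExecutable
  chmod +x "$app/Contents/MacOS/launcher.sh"
  cp pouillotpouillot "$app/Contents/MacOS/"   # the binary exec'd by the launcher
  cp pouillotpouillot.icns "$app/Contents/Resources/"  # CFBundleIconFile
  cp lib*.dylib "$app/Contents/Resources/lib/"         # dependency libraries
}
```

A Makefile target calling `make_bundle PouillotPouillot` after the build keeps everything in one build system.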

That's it, we're done. But you still have to be very careful about how your libs interact with the ones from Mac OS X: there can be conflicts. For instance, if your application embeds libpng you can get this kind of error at startup:

dyld: Symbol not found: __cg_png_create_info_struct
  Referenced from: /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/ImageIO
  Expected in: ///Users/antoine/progs/pouillotpouillot/PouillotPouillot.app/Contents/Resources/lib/libpng15.15.dylib
 in /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/ImageIO

What happens is: your application loads its own libpng, then loads the ImageIO framework, and the ImageIO framework itself uses libpng. But instead of getting the libpng from /usr/X11/lib it expected, it gets the one from your application bundle, and yours is different from the system's. So you'd better use the one from the system. If you really need your own version and want to avoid the system library, it seems you can use DYLD_FALLBACK_LIBRARY_PATH, but I haven't tried it yet.
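A sketch of that variant, assuming the same variables as in the launcher script above (I haven't verified this on a real bundle, so take it as a starting point):

```shell
# In the launcher, use the fallback path instead of DYLD_LIBRARY_PATH.
# DYLD_FALLBACK_LIBRARY_PATH is only consulted when a library is NOT found
# at its install-name path, so system frameworks like ImageIO keep loading
# the system libpng, while your own binary still finds the bundled copies.
# (You can inspect the install-name paths of a binary with: otool -L binary)
bundle_lib="$bundle_res"/lib              # as defined in the launcher script
export DYLD_FALLBACK_LIBRARY_PATH="$bundle_lib"
unset DYLD_LIBRARY_PATH                   # stop shadowing the system libraries
```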

Hope this will help you make more cross-platform applications. It's still a pity to see nice apps not available for your beloved system ;)


Tuesday 6 September 2011

MAME ROM Updater

I started a little project to update your MAME ROMs. The main problem with MAME is that the ROM names and organization frequently change as the project progresses, which makes it very difficult to find the correct ROM for the correct MAME version. This Ruby script intends to parse the MAME source code to build a list of the valid ROMs, then check your actual ROM archives to validate them, fix them if possible and report what is missing. For the moment it only deals with simple cases. It still needs a lot of work to handle ROMs with BIOS dependencies, or ROM definitions that use include directives in the MAME code. Yet it can still be handy as is. Beware, it is not yet properly documented. It should work on any decent OS (by this I mean Mac OS X and Linux) as it only requires some classic shell tools (ruby, zip, shasum, cut, etc.). You can get MAME ROM Updater here.
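To give an idea of the kind of check involved, here is a sketch (not the actual script): verify that a file inside a ROM zip matches the SHA-1 listed in the MAME sources. The helper and any file names you feed it are hypothetical.

```shell
# Sketch: validate one file inside a ROM archive against an expected SHA-1,
# the way the MAME source lists them. Uses only classic tools (unzip, shasum).
check_rom() {
  zip="$1"; file="$2"; expected="$3"
  actual=$(unzip -p "$zip" "$file" | shasum | cut -d' ' -f1)
  [ "$actual" = "$expected" ]   # exit status says whether the dump is valid
}
```

Usage would look like `check_rom pacman.zip pacman.6e <sha1-from-mame-source> || echo "bad dump"` (names hypothetical).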

Wednesday 31 August 2011

Neo-Geo bios project

Yet another project on the stack. My Neo-Geo is begging for some hacks, and this BIOS one is a great one. The main inspiration is the Universe BIOS (or UniBIOS for short), a custom BIOS that provides awesome features like region protection bypassing. It is probably the best BIOS available for Neo-Geo hardware, with a single problem: it's not open source. I know, I'm being a little picky. You can freely download the 2.3 version firmware, and get the latest 3.0 already burned into an EEPROM for a reasonable price (25 euros). Well, you begin to know me, and I just can't go the simple way: I want to develop my own open source Neo-Geo BIOS.

First step: remove the original BIOS

There are basically two techniques for mounting a custom BIOS. The first one is piggybacking, where you basically kill the existing EPROM and solder the new one on top of it. It is meant to be easier, but as the name says it: dirty as a pig. I prefer the second option: completely remove the old EPROM and replace it with a socket. Then you can plug whatever you like into this socket. So that will be step one, and let's hope not the final step, in case I fry the board.

Step Two: Get/Build a proper EEPROM

The Neo-Geo BIOS chip is a 1 Mbit (64K x 16-bit) memory, operating at 5 V with a 120 ns access time (TCS31024P-15). This kind of memory is troublesome to find nowadays. Most 64K x 16-bit memories are discontinued; you can find bigger or smaller ones, but hardly the exact part. Here comes the main part of the project: building a PCB to replace the original EEPROM with two Flash chips. I checked the measurements and it should fit. I just have to figure out the proper layout.

So let's see how this will go.

Thursday 25 August 2011

PS button difficulties

Activating this PS button is more difficult than expected. I found different sources that claim to successfully make this button work:
The first one is a nice project using a breakout board with an AVR to implement a USB controller for the PS3 with the home button enabled. The second one is another AVR project: a controller simulator that gets its input from a serial port instead of directly checking button states. Both projects provide complete source code, which is pretty nice. The third one is a discussion thread of people trying to make the button work; it gives some information about what seems required or not, but no code whatsoever.

Everybody seems to agree the magic bytes are:
0x21, 0x26, 0x01, 0x07, 0x00, 0x00, 0x00, 0x00
Everybody agrees these bytes should be sent to the host after the HID report descriptor. But it is not clear whether they should be sent in the data stage of the setup request, or in a separate transfer just after. From what people say in the thread, it seems they should go in the data stage, but it's just speculation, as I can't find any confirmation in the projects' source code. In the first project, the magic bytes are sent when the host does a HID class request GET_REPORT (not on the standard request GET_DESCRIPTOR that actually sends the HID report descriptor). And in the second project, the bytes are defined but never used at all.

As the only project that seems to potentially work is the first one (it actually uses the magic bytes), I tried to use the same HID report descriptor and report data structure they have, and added the piece of code sending the magic bytes on GET_REPORT. But no success. I don't even get the regular button events caught by the host. I strongly suspect the report descriptor does not correspond to the actual report data structure; in addition this report descriptor contains some Sony-specific stuff. If it is required to make it work on the PS3, I'm pretty sure it will make it fail on the Mac. The PS3 never sends any GET_REPORT request. Not sure if this is due to the fact I use two endpoints (ep0 for control, ep1 for interrupt) whereas the project uses just one (ep0). I don't think so, as the original PS3 controller uses three endpoints; the number of endpoints should not alter the behavior.

It is pretty unclear what conditions the HID report must match to work with the magic bytes and make the PS3 happy. The thread lists some requirements, but again it's just conversational and not strictly clear (is order important? etc.). It is also not clear whether a specific vendor/product ID should be used. I guess not, because I doubt Sony ships a firmware update every time a new controller manufacturer pops up.

I'm going to do some more investigation with different report variations. I saw the original PS3 controller defines some Feature fields in its reports, and these features are 8 bytes long, just like the magic bytes. Maybe a hint?

Tuesday 23 August 2011

USB Joystick update: It works with PS3!

It works with the PS3! After all this time trying to understand what was wrong in the communication protocol (DATA bit setting, etc.), I just figured out what was keeping my USB Joystick project from working on the PS3. And it doesn't really make sense. Just as a reminder, I'm developing a USB joystick to work with the PS3 and PC (Mac in my case). The joystick worked just fine on my Mac, but poorly on the PS3 (see video here). Turns out the DATA bits were completely messed up at that time (can't really tell how the Mac managed to make it work anyway). I fixed all this and ended up with the joystick still working nicely on the Mac, but nothing at all happening on the PS3. After some investigation I noticed the PS3 sends the following request to the device:
HID class request SET_PROTOCOL
This request is supposed to be sent to Boot devices only. But my device is not one; I triple-checked the descriptors and there is no subclass defined on the device or interface. As stated by the USB HID standard, this request is required for boot devices only, so I was rejecting the request with a good old STALL. Pretty normal, I was sticking to the spec. Then, just to check, I enabled the call: just don't STALL the endpoint. And all of a sudden everything just worked. The PS3 was accepting my data. I still can't tell whether or not it is normal for a non-boot device to have to implement this call. In fact, just after the SET_PROTOCOL, the PS3 sends a GET_PROTOCOL, I figure just to check that the setting went OK. I STALL on this GET_PROTOCOL call, but it does not seem to bother the console. In a nutshell, I'm pretty happy to have it working at last, but a bit puzzled by what I had to do for it.

Next steps:
Make the Home button work. Shouldn't be too difficult: there are some projects explaining how to send some magic data in the report to make it work.
Retrieve the controller number. I guess I'll need to use an OUT endpoint with some specific report to get the data from the console. Let's just hope the PS3 will consent to send me the info.

Friday 29 July 2011

Using an Arduino as a USB UART

As you know, I'm working on some modest electronics projects involving microcontrollers. And when you debug these little beasts, having a text output is no luxury. The best way to get one is a USB UART adapter. But if you don't have one, maybe you are still lucky and have an Arduino Duemilanove lying around. This Arduino uses an FTDI chip as a USB UART interface for the AVR chip. You can check the datasheet of the FTDI chip here (pdf), and the schematics of the Arduino here (pdf).

Sweet, all the parts are here, now let's put all this together.

First you need to program your Arduino to set the RX and TX pins to input. This puts the pins' logic gates into an open state that won't conflict with the signals coming from our own chip. To do this, just create a new sketch in the Arduino development environment and copy this code:

void setup()
{
    // put RX and TX of the AVR to a rest
  pinMode(0, INPUT);
  pinMode(1, INPUT);
}
void loop() {
}

Upload this to the Arduino and run the Serial Monitor. Note you have to leave RX/TX unplugged when programming the Arduino. You can now set the data transfer rate to an appropriate value; in my case I will use 115200 baud.

Now we can hook our own microcontroller to the FTDI chip. Link the TX of your chip to the TX of the board (it actually corresponds to the RX of the FTDI chip). That's pretty much the only hardware manipulation necessary.

Last step: set up the USART of your microcontroller. Here is an example of how I set it for my project using a PIC18F. This example uses the sdcc iolib:
  usart_open(
             USART_TX_INT_OFF &
             USART_RX_INT_OFF &
             USART_BRGH_HIGH &
             USART_EIGHT_BIT &
             USART_ASYNCH_MODE,
             103);
  BAUDCONbits.BRG16 = 1;
  stdout = STREAM_USART;
  printf("Starting USB\n");

You will need to calculate the proper baud rate generator value for your project; refer to the datasheet of your microcontroller. Here 103 corresponds to the following calculation taken from the PIC18F2450 datasheet (48 MHz clock, targeting 115200 baud; the 4 is because we use asynchronous transfer with BRG16 and BRGH set): ((48000000 / 115200) / 4) - 1.
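The arithmetic checks out with plain shell arithmetic, if you want to sanity-check your own clock/baud combination:

```shell
# SPBRG value for Fosc = 48 MHz, 115200 baud, BRG16=1 and BRGH=1 (/4 mode):
echo $(( 48000000 / 115200 / 4 - 1 ))   # prints 103
```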

Just compile that, upload it to the board, and the debug message magically appears in the Arduino console:
Starting USB

Bad kitty ! Bad !!

Can't resist the urge to share my frustrations with the release of Mac OS X Lion. As a start, let me explain that I didn't upgrade to Snow Leopard when it came out. That version didn't bring groundbreaking changes (well, none I needed), so I kept working with regular Leopard and everything was nice. But as time went by, some software began to be available for Snow Leopard only, and Leopard got more and more on the verge of being deprecated. Nothing more natural than that. At that time Lion was already showing the tip of its whiskers. So I thought "hey, let's just skip Snow Leopard completely and jump to Lion; I just need to hold on for a month or two with some old software". Then the first bad news came: it is not possible to install Lion without Snow Leopard. OK... fair enough, I'm still waiting, hoping Snow Leopard will get a price cut on Lion's release. Wrong again! Lion is out, and Snow Leopard is still at 30 bucks. OK, I get the message Apple, I'm going to buy your goddamn manslaughter feline (I already begin to feel the sharp teeth and claws digging through my skin). I went to the local Apple store; guess what? No more Snow Leopard. I went to another local Apple dealer, same thing. OK... so I'm going to buy it online. Guess what? I ordered it on July 28th and it will ship only on August 11th! I won't get it before the 15th. Almost three weeks! It had better not hiss at me when I install it.

Wednesday 6 July 2011

Install Ruby with TK on Windows XP

Developing a small graphical cross-platform application with Ruby? It seems TCL/TK is the way to go, as Ruby's TK support is part of the core development, and TCL/TK is available for most platforms (Windows/Mac/Linux). But while everything for Ruby/TK is packaged by default on Mac OS X (and I guess Linux), it's not the same on Windows. The Windows RubyInstaller does not provide the TK extensions anymore, and despite the numerous howtos on the net, none really specifies which version of each component you need. And the different versions require different installations. So here is a quick guide to installing Ruby with TK support on Windows XP.

Basically you have two options:
  • Using Ruby 1.8. In this case you need to install some existing ruby tk binaries.
  • Using Ruby 1.9. In this case you need the ruby tk gem to build the binding yourself. I didn't manage to get this option working, but I still explain below how the installation goes.

Install TCL/TK

This step is common to both installations. Use the convenient ActiveTcl installer you can find here: http://www.activestate.com/activetcl
For Ruby 1.8 you need version 8.4 instead of 8.5 (the latest at this time).

Install Ruby

Use the convenient RubyInstaller for Windows you can find here: http://www.ruby-lang.org/en/downloads/

Install TK binding for Ruby 1.8

Get the archive at https://github.com/rdp/ruby_windows_tk and decompress it on your drive. You just need to cd in this folder and run:

ruby install.rb
This binary installation is meant for Ruby 1.8.5, but it seems to work pretty well with Ruby 1.8.7 (the latest at the moment).

Install TK binding for Ruby 1.9

In this case you will need to build the binding yourself, so first we have to install the RubyInstaller Development Kit. It contains the compiler and the files needed to compile gems, and you can download it there: http://rubyinstaller.org/downloads/

Note to install the devkit you have to follow these few steps (taken from: https://github.com/oneclick/rubyinstaller/wiki/Development-Kit)
  • Run the bin and uncompress data in <DEVKIT_INSTALL_DIR>
  • Start Command Prompt With Ruby
  • cd <DEVKIT_INSTALL_DIR>
  • ruby dk.rb init
  • ruby dk.rb install
Now let's install the tk gem. As explained here https://github.com/rdp/tk_as_gem, Ruby TK gem is a little hack with the regular Ruby 1.9 sources for TK embedded in a gem. To build it use the following command line:
gem install tk_as_gem -- --with-tcl-dir=c:\Tcl --with-tk-dir=c:\Tcl
Testing Ruby TK

Now everything is in place; run the following code in the Ruby console:

require 'tk'
root = TkRoot.new { title "Ex1" }
TkLabel.new(root) { text 'Hello, World!'; pack { padx 15; pady 15; side 'left' } }
Tk.mainloop

This should display a simple window with a label.

Thursday 30 June 2011

Supergun project

I know I didn't even manage to get my USB Joystick fully working yet, but I have a Supergun project in mind. It's a long-term project, as it will be composed of different parts, each requiring different electronics skills, promising long hours, days and months of enjoyment.

For those who don't know what a Supergun is: it's a device that lets you play arcade PCBs on a regular TV/monitor screen with regular controllers, instead of a dedicated arcade cabinet. It's basically composed of three parts:
  • Power supply. The arcade PCBs require 12 V and 5 V power, and even -5 V for some boards.
  • Controls input. The arcade PCBs provide pins to connect controllers; the Supergun just exposes them so you can connect regular ones.
  • Audio/video output. Arcade PCBs output an RGB signal with a 15 kHz scan rate; this video signal is converted to VGA or PAL/NTSC for a regular monitor/TV set. The audio itself is mostly untouched.

There exist numerous Superguns on the market, most of them homemade, but none really fits my needs. And I'm quite surprised by this, as some features seem pretty useful. So I intend to create my own, with an awesome list of features:
  • Integrated power supply. I know it would be much easier to get a PC ATX power supply or any kind of existing external power unit, but I'm really looking for something that can sit on a table without taking too much space, and won't require a lot of noisy cooling fans. Plus a power supply is an awesome electronics project. I must admit I don't know how many watts a JAMMA board can suck, but I think it will need much less than the 350 or 400 W of an ATX PSU. In addition I'd really like to be able to monitor the power supply state (mostly consumption) and build some fine-tuning mechanisms: some arcade boards are pretty old and may require specific tuning to work.
  • Raw pins for control input AND USB. All Superguns just expose the JAMMA pins to connect a simple controller. It's pretty simple and highly efficient, but it lacks flexibility: mostly the possibility to add 4 players easily, or to plug in the USB joystick you already use on PC or console, or to use the exotic devices some PCBs use, like a trackball or a driving wheel with pedals. The USB interface should let you plug in regular PC devices and emulate the JAMMA device for the board.
  • USB service port. I especially want to be able to plug the Supergun into a PC. The features we can imagine with a Supergun as a USB device are numerous, but I mostly want to be able to record the audio/video, update the firmware, and configure the settings (button layouts, power, etc.).
  • Video upscaler. I intend to output an RGB VGA signal rather than TV PAL/SECAM, mostly because it only requires a video upscaler to double the scan lines, while supporting PAL and/or SECAM just adds region problems with different TV sets. VGA is simple, and nowadays even TVs have a VGA input. A nice-to-have is a scan-line generator integrated with the scan doubler: just making the added lines darker gives a fine CRT feeling on an LCD. Offering the possibility of video inversion could be nice, as some PCBs output inverted video for projectors, but these boards usually have configuration pins for video inversion, and it would surely complicate the video circuit and add latency. So not required, but nice to have.
  • Open source hardware and software. I intend to publish everything, because let's face it, I'm not an electronics engineer or an awesome coder, so help from the community is always a plus. I really intend to build the hardware with plenty of room for some cool software features I haven't thought of yet.
A great project, a lot of ideas, but I really should finish this USB Joystick first.

Tuesday 28 June 2011

USB Joystick

For some time now I have been trying to develop a USB joystick out of a broken one. The goal is to remove the faulty PCB and replace it with a board of my own design. It's a much longer and harder process than I thought, but it is quite rewarding and I'm learning a lot along the way. Here is a small video presentation of where the project is at.

Hello World

I'll post here my various hacks and developments projects I'm working on. And maybe one day I will be able to actually finish one of them!