Well, I'm just wondering if there is any tutorial or hint on how to create video files that can be played via the videoplayer.bin application from Krikzz (http://krikzz.com/pub/support/everdrive-md/module/bin/videoplayer.bin). Does anybody know? Many greetings...
From the source we can see that the video is 28 tiles wide by 21 tiles tall. A tile is 32 bytes: 8x8 pixels, each a nibble wide (16 colors). u32 is 4 bytes, and the * 8 inside the array gives 4 * 8 = 32 bytes, so img_buff is the tile data for the video frame.

Looking further, we find the frames being loaded from the sector buffer (addr) that is set up just prior to this function. A full sector (512 bytes) is read into pal_buf, which is 256 words, so it all fits. The routine then goes on to read the data into the tile buffer, img_buf. Here's a slight bug in the player: it reads the image data a sector at a time (512 bytes), but the tile buffer is 28*21*32 = 18816 bytes, which divided by 512 is 36.75 sectors. So the last sector read will overrun the tile buffer. Hope there's nothing important right after it!

So the file format is just a raw dump of palette data followed by frame data: 256 words of palette values, then 36.75 sectors of tile data per frame. Note that although the player reads 37 sectors, the file pointer is only advanced by 36.75; the file stores the palette and the frame at exactly the sizes the arrays are defined for, not the amount read. Hope that makes sense. The data is then DMA'd to VRAM and the screen scrolled to show the new frame, and the code loops until the file runs out of data or you press B.

So, to make your own video, you need to convert each frame into a 224x168, 16-color image. You then need to save the palette (16 words of 9-bit BGR values, padded out to 256 words), then the raw frame data with 4-bit pixels, but in TILE format (see the sketch below)! Sixpack by mic can give you the palette and 16-color image data for each frame. Then you have to concatenate all the data. Easy, no? Hopefully KRIKzz has something that automates that procedure a little.
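To make the layout concrete, here's a rough Python sketch of how one frame record could be built. This is not KRIKzz's tool, just an illustration of the format described above: the 9-bit BGR word packing and the 4bpp tile layout (4 bytes per pixel row, left pixel in the high nibble) are standard Mega Drive VDP facts, while the big-endian word order and the plain row-major tile ordering are assumptions, and build_frame/md_color are names I made up for the sketch.

```python
import struct

TILE_W, TILE_H = 28, 21          # frame is 28x21 tiles = 224x168 pixels

def md_color(r, g, b):
    """Pack an 8-bit RGB colour into a 9-bit Mega Drive BGR word (0000BBB0GGG0RRR0)."""
    return ((b >> 5) << 9) | ((g >> 5) << 5) | ((r >> 5) << 1)

def build_frame(pixels, palette):
    """pixels: 168 rows of 224 palette indices (0-15); palette: up to 16 (r, g, b) tuples."""
    out = bytearray()

    # 256 words of palette data: 16 real colours, zero-padded to fill one 512-byte sector.
    # Words are written big-endian, assuming the 68000 DMAs them to CRAM as stored.
    words = [md_color(*c) for c in palette] + [0] * (256 - len(palette))
    out += struct.pack('>256H', *words)

    # Tile data: 28x21 tiles, 32 bytes each, two 4-bit pixels per byte.
    # Tile order is assumed to be plain row-major (left to right, top to bottom).
    for ty in range(TILE_H):
        for tx in range(TILE_W):
            for row in range(8):
                line = pixels[ty * 8 + row][tx * 8:tx * 8 + 8]
                for i in range(0, 8, 2):
                    out += bytes([(line[i] << 4) | line[i + 1]])
    return bytes(out)

# A whole video is then just these frame records concatenated back to back.
```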
I have a tool which can concatenate all frames and palettes into a single data file, and I also have a tool which can convert 16-color images into Sega format, but it still needs a lot of hand work.
Someone please do the Rickroll! Been done for SNES, time for an MD port I think. Rename it as Sonic 1 Alpha and ship it over to Sonic Retro.
What you need is a shell script that dumps the frames to individual pictures, then uses something like ImageMagick to resize each frame, then runs sixpack on each frame to get a bin of the palette and frame data, then pads the palette, then concatenates the palette and frame data for every frame (something along the lines of the sketch below). I might try my hand at that next week. It would be a bash script targeting Linux, not Windows, although it might work in Cygwin.
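Not the bash version yet, but here's a rough Python sketch of that same pipeline so there's something concrete to look at. It reuses the hypothetical build_frame() from the sketch in the earlier post instead of sixpack (I don't have sixpack's exact command-line switches handy), assumes mplayer and ImageMagick's convert are on the PATH, and clip.avi / video.bin are just placeholder names.

```python
import glob
import subprocess
from PIL import Image              # Pillow, for reading the dumped frames
from makeframe import build_frame  # the build_frame() sketch above, saved as makeframe.py

# 1. Dump every frame of the clip to PNGs (00000001.png, 00000002.png, ...).
subprocess.run(['mplayer', '-vo', 'png', '-nosound', 'clip.avi'], check=True)

with open('video.bin', 'wb') as out:
    for name in sorted(glob.glob('0*.png')):
        # 2. Force the frame to exactly 224x168 with ImageMagick ('!' ignores aspect ratio).
        subprocess.run(['convert', name, '-resize', '224x168!', 'frame.png'], check=True)

        # 3. Reduce to a 16-colour paletted image and pull out the indices and the palette.
        img = Image.open('frame.png').convert('RGB').quantize(colors=16)
        flat = img.getpalette()[:16 * 3]
        palette = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        pixels = [[img.getpixel((x, y)) for x in range(224)] for y in range(168)]

        # 4. Append this frame's palette + tile data record to the output file.
        out.write(build_frame(pixels, palette))
```

Doing the 16-colour reduction in Pillow rather than with ImageMagick's -colors keeps the palette and the pixel indices in one place; a proper script would probably let sixpack handle that part instead.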
You can dump a clip to JPEG or PNG using "mplayer -vo jpeg -nosound clip.avi" or "mplayer -vo png -nosound clip.avi". Mplayer can handle just about any input format (I just used avi as an example), and you'll get a nice set of images out, like 00000001.png, 00000002.png, etc. Mplayer also has a lot of options you can apply to the video, like resizing and such. Then it's a matter of using sixpack on the frames to get the raw data needed.