
2011/07/26

X-SendFile or how to send a big static file to the client


X-SendFile is a header that can be sent from a scripting language like PHP, Python, Perl or Ruby through CGI (or equivalent) or from a FastCGI application. It is processed by the http server (and never sent to the client) and tells the server to serve a static file instead of the CGI output.

Suppose we want to send a big file from a CGI behind authentication. The traditional way was to set the Content-Type header and send all the binary data; some sophisticated programs even parsed HTTP headers in order to send only chunks of the file. However, each of those requests keeps memory, handles and a process busy for the whole transfer, with the http server acting as a bridge that relays all the data to the client.
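
As a rough sketch of that traditional approach (the authorization helper and paths are hypothetical placeholders), the script stays busy until the last byte has been sent:

<?php
// Hypothetical traditional handler: PHP streams the whole file itself.
if (!user_can_download()) { // hypothetical authorization check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$path = '/path/to/my/file';
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path); // PHP and the http server bridge stay busy for the whole transfer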

Using X-SendFile you only need to send that header with the path of the file you want to serve and finish the CGI/FastCGI request. The script stays in memory only for the time it takes to verify the authentication and send a single header to the server, which is much more efficient. We can even store in a session that the user has permission to download the file and read that session quickly. The http server takes care of the headers and of sending the data the user requested.

In PHP:
header('X-SendFile: /path/to/my/file');
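
A slightly fuller sketch (the session flag and the path are placeholders) of an authenticated download using X-SendFile:

<?php
// Hypothetical X-SendFile handler: PHP only checks access and sets one header.
session_start();
if (empty($_SESSION['can_download'])) { // hypothetical permission flag stored in the session
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('X-SendFile: /path/to/my/file');
exit; // from here on, the http server streams the file by itself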

The X-SendFile header is implemented in the most popular http servers: in the core of lighttpd and nginx (nginx implements the same idea with its own X-Accel-Redirect header), or via the mod_xsendfile module for Apache.

Each server handles the header in its own way, with its own configuration parameters and security restrictions. Read your server's documentation for the details.

2011/07/24

DPspEmu r301, as3exec, CSharpUtils.Templates and Doctors without Borders


A lot of code has passed since the last time I wrote about programming here, so I'm going to summarize the news about my open-source projects. There is a lot of it, and some of it is thrilling! I will probably write a specific post about each of these projects.


I have just released revision r301 of my PSP emulator, the first release in more than a year. I have been working on it for several months in my spare time, though I was holding off the release.
Initially I was waiting for the commercial part of http://kawagames.com/ to launch, but I decided to release the emulator earlier, since it was taking more time than initially expected. In any case, the commercial part of Kawagames should launch very soon, as soon as the developers we are working with finish updating their games and upload them to our site.


So, let's go. I'm going to start with a mini-changelog (the changes are so vast that a full one would be pointless):


  • Some commercial games are running! Check our compatibility list, and feel free to add missing games.
  • Entirely remade multithreaded CPU core!! It uses one native thread per PSP thread, so it gets faster on newer CPUs with multiple cores. Lots of games run several PSP threads: one for the game, another for decoding audio... Before this release the CPU was emulated on a single thread; now it uses one Windows thread per PSP thread, running in parallel on multicore CPUs without any penalty for switching PSP threads, because nothing has to be switched.
  • GPU morphing and skinning!
  • Components synchronized through mutexes and events instead of polling (still not perfect). A better and faster design that eliminates some bottlenecks.
  • More homebrew compatibility!
  • Now using Eclipse as the IDE, with DDT for autocompletion! Much more productivity.
  • Cleanups
  • More APIs implemented
  • hq2x and output scaling...
  • Cheats (enabled by command line)
  • Command line tools for advanced users. "pspemu.exe --help" to see a list of commands.
  • Lots and lots of work
  • Background music (Atrac3+ support) with SonicStage (and its WaveOut codec) installed.
  • And more...


Lots of news, and all of it very exciting.

One native thread per emulated CPU thread is something very rare to see in a console emulator. It can only be achieved in an HLE emulator, and on consoles with a single core (the PSP, for example), where thread scheduling used to be predictable, it can make games that don't use synchronization primitives correctly misbehave. However, it is possible to emulate a single-core environment by pausing and resuming threads, and a compatibility mode based on that idea could make badly designed games work well.

Commercial games had never worked because of a bad implementation of module loading and code relocation, and all PSP games use PIC (Position Independent Code).



It can run Tales of Eternia. That game was my first translation, my first hacking, my first introduction to compression algorithms... and now it is also one of the first commercial games my emulator can run.

I will write about more details and interesting aspects of the emulator in future posts.




After suffering a lot doing poor unit testing in ActionScript 3 with ASUnit, and finding it impossible to automate test execution, which is a fundamental practice in continuous integration, I decided to do something about it. The result is this open-source project.

The idea is pretty simple. Using .NET, I create an invisible window that hosts an ActiveX component with a Flash Player, whose version I can choose by selecting the OCX file of the Flash Player version I want to run. It exposes a set of methods through the ActiveX ExternalInterface that allow writing to the standard output and stopping the application (just what I needed to automate and analyze tests).

I created the C# application, the ExternalInterface bindings and a very simple framework for unit testing.

So if you want to do TDD with AS3 and automate your tests, don't hesitate to use this project. It has very few commits, but I think it is simple enough to be considered reasonably stable and usable in a real project.




This is the project I have been working on lately. I started it some time ago, but I hadn't continued it until now.

Lately I have been noticing how slow and unstable PHP is at almost everything, and I have found the ultimate way to fix that slowness and instability: don't use it.

(Note that even though I'm mad at PHP, I am a certified PHP developer and I still use it in most of my web projects.)

I spent several days investigating ways to use true FastCGI with PHP. I looked at the PHPDaemon project, but it didn't fit my needs: it is built on libevent and asynchronous programming, which is very hard to use, and for that style you are better off with something like node.js. PHP was not designed with that in mind and it is very inconvenient: closures become unbearable as soon as you try to access the parent scope, and you realize it is absurdly uncomfortable to keep even part of the performance.
PHP has a very poor design and lots of problems. It is still interpreted and slow, it has plenty of memory leaks, and it is full of antipatterns that produce bugs and security flaws everywhere.
And you can't even build anything serious with it. It doesn't have threads, and at any moment you can get a fatal, uncatchable error that makes you want to kill someone. If you are trying to write a FastCGI server in PHP, not getting fatal errors is crucial, because once a fatal error is raised you can't send the output: the socket is closed and there is nothing you can do about it. So if you declare a function twice, you won't even be able to send the error through the FastCGI channel. PHPDaemon tries to solve this problem using forks, but that is probably not the most efficient way out there.

Lately I have been using Twig in my PHP projects. Twig is a very good templating system, similar to Django's, supporting template inheritance, macros and other great constructs. It is very easy to use and you end up with very concise and clear templates.
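
As a rough idea of how little code it takes (paths and template names are placeholders), a minimal Twig 1.x setup looks something like this:

<?php
// Minimal Twig 1.x bootstrap; paths are placeholders.
require_once '/path/to/lib/Twig/Autoloader.php';
Twig_Autoloader::register();

$loader = new Twig_Loader_Filesystem('/path/to/templates');
$twig = new Twig_Environment($loader, array('cache' => '/path/to/cache'));

// index.html can use {% extends "base.html" %}, blocks and macros.
echo $twig->render('index.html', array('title' => 'Hello'));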

What's wrong with it? Twig has lots of classes and lots of code to execute every time it is set up, even when the template has already been loaded. And PHP works in such a way that every time someone requests a page, it starts from scratch.
That is why I'm interested in FastCGI. With FastCGI you can keep all the classes in memory, along with as many templates and as much first-level cache as you want. Furthermore, you get shared memory if you use threads instead of processes, and library initialization runs only once: when the FastCGI server starts.
But again, true FastCGI is incompatible with PHP, and it will stay that way forever and ever.
So with that in mind, and knowing that PHP is not going to change, I will be the one to change.
I checked out ASP.NET, but it felt too much "on rails" for me. I want total control over my application: a normal application where every part runs code I control. And if I discover a new kick-ass way of doing things, I can switch to it without depending on how some company decided I have to do things.

I like Django/Twig templates, I want to use the web server I choose (I'm going to use nginx), and I want to build my framework the way I want.
I started the CSharpUtils project with that in mind, and it now has lots of interesting modules.
I have a module that starts a FastCGI server and a class extending it that handles HTTP FastCGI requests.
I also have a template system similar to Twig/Django. I think templating systems have to be comfortable to use and very flexible, without throwing errors, so I got it working with dynamic typing, which is pretty easy with .NET 4.0.

If you want to build things that scale and stay maintainable over time, you have to do TDD, and Visual Studio supports TDD in a very comfortable way.
Static typing and autocompletion help a lot. In PHP you can't have that in many scenarios; even with Eclipse doing a great job, PHP loses types everywhere and it ends up being a pain.
So I have put an end to that problem by using .NET. I have static typing and a great VM from the Mono project http://mono-project.com/Main_Page that is lightning fast. Goodbye to implementing algorithms as nasty code built from very high-level functions just to get decent speed out of PHP.

By the way: with PHP, my deploy flow was updating a folder, clearing caches and so on.
Now my flow is going to be: compile the executable, embedding (or not) code and templates; upload the static files that nginx will serve; progressively stop the old FastCGI servers and start the new ones. FastCGI also lets you run the program as any user, so even if someone managed to get code execution (very rare with .NET), they would only have the privileges of the user that launched the FastCGI instance.




I was waiting for the release of the emulator and Kawagames to make this public: from now on I'm going to donate to charities, giving away part of the profit I get from my commercial projects and of the donations people give me.

I'm going to donate 10% of all the profit from my projects: Kawagames, advertising on the PSP emulator, donations and my future commercial projects.

I'll make the donations every quarter, starting from the release of commercial games on Kawagames.


I think that every single person running a commercial activity should donate a percentage to good causes. If everyone thought that way, this world would be a better place to live. So don't hesitate: a single gesture doesn't make the difference, but all of us together can make this a better place to live in.