Thursday, February 16th, 2012

Python ROM chip possible?

Almost every computer science professional knows the history of the personal computer. Do you recall the Apple II series of computers? The Apple II had a ROM chip which contained its BASIC interpreter, Integer BASIC. I recently thought about bringing this idea back for embedded systems. There already exists a processor extension in the ARM world designed to support JITs for dynamic languages such as Python: ThumbEE, the Thumb Execution Environment.

A co-processor containing the core Python library, perhaps flashable to newer versions, could make Python in embedded applications run very quickly: the Python co-processor would do most of the processing directly and pass machine code on to the main processor.

I'm not an expert in processor design, nor do I fully understand how processors work internally. However, looking at past efforts such as BASIC running from a chip, I'm not sure whether the chip just holds the runtime or actually translates the statements into machine language.

Python is a very powerful language, and I would like to see a stand-alone, Python-compatible processor built where Python code could be bootstrapped directly from NAND flash with no need for an underlying OS. If we are to take embedded systems to the next level, we need to do away with the OS and devote all system resources to the application being executed.

A concept of this OS-less system could be demonstrated with a bootloader which brings up a Python environment and loads a specific module. This could be done on current PCs or embedded devices; Python would just need to be ported so that the OS-specific features talk directly to the hardware and perform their tasks on their own without relying on an OS. A fun use of a specialised Python chip would be to build a scientific calculator with it. When powered on, it would present a Python prompt: simple calculations are possible, but someone skilled in Python could perform very advanced calculations and draw graphs using standard Python libraries, as the sketch below illustrates.
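
As a rough sketch of that calculator prompt, here is what the firmware's entry point might boil down to, using only the standard library (the math preload and the banner text are just placeholder assumptions, not part of any real device):

    # Minimal sketch of a "power-on Python prompt" for a hypothetical
    # Python-based calculator. Assumes a normal CPython runtime; on real
    # hardware this would be the first thing the bootloader starts.
    import code
    import math

    # Preload a few names a calculator user would expect at the prompt.
    calculator_namespace = {
        "sin": math.sin, "cos": math.cos, "tan": math.tan,
        "log": math.log, "sqrt": math.sqrt, "pi": math.pi, "e": math.e,
    }

    console = code.InteractiveConsole(locals=calculator_namespace)
    console.interact(banner="Python calculator ready.")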

I'm not entirely sure of the possibility of this idea becoming a reality, but it is something to ponder. I can see great uses for such a co/processor that thinks Python, especially in the scientific community or robotics. Let me know your ideas on this thought in the comments below.

Comment #1: Posted 2 years, 2 months ago by Tshepang Lekhonkhobe

Interesting that you are thinking about this, because I've been pondering something very similar... reducing the complexity of the way current computers work.

Mine was a bit crazier, as in running Python "directly" on the chip, as if it were assembly language. Given the small size of in-processor memory, I don't know how feasible that would be. Perhaps a subset of the Python distribution/language would do. But then again, I'm not very conversant in this field (hardware architecture, and computer science in general), so maybe I'm talking nonsense.

Comment #2: Posted 2 years, 2 months ago by Kevin Veroneau

That was one of my ideas: to run Python "directly" on a chip, or rather the JIT component of Python, where the processor would be the runtime for Python bytecode, aka .pyc files. However, since Python is a very dynamic language, building a chip like this would be very difficult. Hence my idea of a secondary chip which can at least speed Python up.
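
To make the "runtime for .pyc" idea concrete, here is a small sketch that uses the standard dis module to print the opcode stream such a chip would have to fetch and execute (the function being disassembled is just an arbitrary example):

    # Sketch: inspect the CPython bytecode a hypothetical Python chip
    # would need to execute; dis is part of the standard library.
    import dis

    def average(values):
        total = 0
        for v in values:
            total += v
        return total / len(values)

    # Prints the opcode stream (LOAD_FAST, the add/divide opcodes, a
    # FOR_ITER loop, etc.) that a bytecode-level processor would consume
    # instead of ARM or x86 instructions.
    dis.dis(average)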

Comment #3: Posted 2 years, 2 months ago by mikko

Hi,

We did Python work for Nokia's Series 60 mobile platform and some experiments with Python on Maemo.
CPython is too slow for embedded work.

In particular, import is too slow, because instead of bit-blitting a ready-made image into memory, Python needs to "run" each module it imports. This kills start-up performance.
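
A tiny demonstration of why that hurts: importing a module executes its top-level code, so anything slow at module level delays every program that imports it (the module written here is a throwaway stand-in):

    # Sketch: import time scales with the work a module does at load time.
    import os, sys, time

    # Create a throwaway module whose top-level code is slow.
    with open("slow_module.py", "w") as f:
        f.write("import time\n")
        f.write("time.sleep(0.5)  # stand-in for expensive module-level setup\n")
        f.write("VALUE = 42\n")

    sys.path.insert(0, os.getcwd())  # make the throwaway module importable
    start = time.time()
    import slow_module  # the sleep above runs before import returns
    print("import took %.2f seconds, VALUE=%d"
          % (time.time() - start, slow_module.VALUE))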

Once the application is running, the speed is okay-ish, but don't ever think about writing a mobile application with CPython.

Python's minimum memory usage is about 12 MB (for Hello World, as reported by Google App Engine). That's an expensive piece of RAM you need just for the runtime...

Maybe you should forget CPython-based thinking, or coprocessors, and simply use:

* PyPy with an ARM backend (see if there are static compile options)

* Some other Python-to-C compiler (there are a few)

In any case, the development cost vs. value ratio of a Python co-processor is not very realistic.

Comment #4: Posted 2 years, 2 months ago by Ode

BASIC ROM chips are just storage for an interpreter. It is language semantics that determine whether something is suitable for translation to machine code; most dynamic languages can't be translated that way (BASIC is already dynamic, Python much more so), not without advanced techniques like the runtime feedback used in JITs. Burning the interpreter to ROM gives no performance benefit; making the OS more specialised is the one thing that might have an upside in terms of performance.

Comment #5: Posted 2 years, 2 months ago by Ken Whitesell

Both the Apple ][ (with Integer Basic) and the ][+ (Applesoft - along with the //e, etc) contained the intrinsic editor and interpreter for running the appropriate version of Basic.
Lines weren't compiled in any way to machine code - they remained stored in memory as lines of Basic. However, that doesn't mean they were stored entirely as text: keywords (LET, PRINT, etc.) were tokenized into 1-byte values, and those values were indexes into a table containing the addresses of the routines which processed those tokens.
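
A toy sketch of that tokenize-and-dispatch scheme (the token values and handlers below are invented for illustration; they are not the real Applesoft ones):

    # Toy model of the Apple II scheme: keywords become one-byte tokens,
    # and each token indexes a table of handler routines.
    variables = {}

    def handle_print(args):
        print(" ".join(args))

    def handle_let(args):
        variables[args[0]] = args[2]  # e.g. LET A = 5

    KEYWORD_TO_TOKEN = {"PRINT": 0x80, "LET": 0x81}   # invented values
    TOKEN_HANDLERS = {0x80: handle_print, 0x81: handle_let}

    def run(line):
        keyword, _, rest = line.partition(" ")
        token = KEYWORD_TO_TOKEN[keyword]      # "tokenize" the keyword
        TOKEN_HANDLERS[token](rest.split())    # dispatch through the table

    run("LET A = 5")
    run("PRINT HELLO WORLD")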

To get back on topic, I don't see any reason why you couldn't do as you suggest - except I'm not sure I see the value of completely removing an OS layer. Seems to me like you'd be reinventing the wheel. I'd submit that you could engineer a "custom-fit" Linux (or BSD) kernel that would be smaller, more flexible, ready sooner, and probably more robust than replacing those features with a custom-coded core.

I like the idea of a programmable calculator in Python, but even better is one where you can copy your functions to/from a PC/tablet via USB, Bluetooth, WiFi, or ethernet. For robotics, I'd think USB and WiFi/ethernet would be a requirement. Why redo all that work when it's already done for you in the kernel? There's nothing saying you can't provide the interface libraries such that handling those devices can be done with Python - that would be cool.
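
As a small example of leaning on the kernel instead of rewriting it, the standard socket module already gives Python TCP/IP over whatever ethernet or WiFi driver the kernel provides (the host, port, and request below are invented placeholders, not a real protocol):

    # Sketch: using the kernel's network stack from Python to fetch a
    # function file from a PC. Host, port and the request format are
    # invented placeholders.
    import socket

    HOST, PORT = "192.168.1.10", 9000  # hypothetical sync service on a PC

    with socket.create_connection((HOST, PORT), timeout=5) as conn:
        conn.sendall(b"GET /functions/plot.py\n")
        data = conn.recv(4096)
        print("received %d bytes" % len(data))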

Remember, Applesoft didn't have any native file system (added by Apple's DOS and derivatives), multitasking (no timer interrupts), or network support. It also didn't support (directly) any other external device - those devices were supposed to include ROMs on the interface cards.

Comment #6: Posted 2 years, 2 months ago by anatoly techtonik

Py3k unicode will be a PITA. And to communicate with hardware you need good tools for pushing bits and bytes over interfaces, plus memory management.
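
For the bits-and-bytes side, the standard struct module is the usual tool; a small sketch (the register layout here is invented for the example):

    # Sketch: packing a write command for a hypothetical peripheral,
    # using the standard struct module. The frame layout (command byte,
    # 16-bit register address, 32-bit value, little-endian) is made up.
    import struct

    def pack_write_command(register, value):
        return struct.pack("<BHI", 0x02, register, value)  # 0x02 = "write"

    frame = pack_write_command(0x0010, 0xDEADBEEF)
    print(frame.hex())  # -> 021000efbeadde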

Getting the input interpreted by a system-on-chip as Python instructions is an interesting idea. I wonder how far it can go with PyPy.

Comment #7: Posted 2 years, 2 months ago by Adam Skutt

You should keep in mind that ARM used to have hardware accelerated support for Java bytecodes, called Jazelle. Modern ARM chips still support the instructions, but no longer provide any meaningful hardware acceleration. Hardware acceleration of "high-level" bytecode languages is pretty complicated and rarely does the benefit exceed the cost. If it doesn't make sense for the JVM, it's very difficult to imagine it making sense for the Python bytecode.

Likewise, putting a Python compiler and runtime precompiled to ARM on a ROM chip doesn't really gain you much and has a huge cost. All you've done is make your Python harder to update for /maybe/ some slight performance gains due to faster memory access. Realistically, faster memory access isn't going to make a difference with Python.

You're also mistaken about what matters in embedded systems. If you have the money, power, and weight to afford an ARM processor, you can also afford an operating system. ARM itself is complicated enough that an OS is really required, even more so when you consider the whole ecosystem. Plenty of embedded systems lack an operating system (this is where the world started, after all) but also lack processors as powerful as ARM. You're effectively suggesting we go backwards, not forwards.

Likewise, a scientific calculator would never do what you suggest because it costs too much and requires too much power draw. A really fancy graphing calculator might, but even there I doubt it.

Comment #8: Posted 2 years, 2 months ago by Frzn

What about python-on-a-chip?

http://code.google.com/p/python-on-a-chip/


Features of the PyMite VM:

* Requires roughly 55 KB program memory
* Initializes in 4 KB RAM; print "hello world" needs 5 KB; 8 KB is the minimum recommended RAM
* Supports integers, floats, tuples, lists, dicts, functions, modules, classes, generators, decorators and closures
* Supports 25 of 29 keywords and 89 of 112 bytecodes from Python 2.6
* Can run multiple stackless green threads (round-robin)
* Has a mark-sweep garbage collector
* Has a hosted interactive prompt for live coding
* Licensed under the GNU GPL ver. 2

Comment #9: Posted 2 years, 2 months ago by David Boddie

I tried to leave a comment before but just got an error page. Adam and Frzn have pretty much said what I was going to say with respect to PyMite and CPUs vs. ROMs.

Personally, I think you could get quite far with a minimal Linux distribution and user space environment. It's useful to have some kind of operating system underneath to abstract away the less interesting details of the hardware. You would still have access to the Linux framebuffer device, so you would be able to take advantage of the standard way of interfacing with it without having to care about what the display hardware is, for example.
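
For example, a minimal sketch of painting the framebuffer from Python (it assumes /dev/fb0 exists and is writable, that the display is 32 bits per pixel, and that the sysfs path below is present; real code should query the geometry through the framebuffer ioctls):

    # Sketch: filling the Linux framebuffer from Python via mmap.
    import mmap

    # Read the display geometry from sysfs (assumption: fb0 exists).
    with open("/sys/class/graphics/fb0/virtual_size") as f:
        width, height = (int(n) for n in f.read().split(","))

    bytes_per_pixel = 4  # assumption: 32 bits per pixel

    with open("/dev/fb0", "r+b") as fb:
        frame = mmap.mmap(fb.fileno(), width * height * bytes_per_pixel)
        # Fill the screen with solid blue (BGRA byte order assumed).
        frame.write(b"\xff\x00\x00\x00" * (width * height))
        frame.close()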

However, I wouldn't underestimate the usefulness of specialised CPUs. There are some pretty smart people out there who find working with things like FPGAs to be second nature, so who knows what kinds of hardware support might become available in the future.

Comment #10: Posted 2 years, 2 months ago by Adam Skutt

Specialized hardware is great, but only if you need it. More importantly, FPGAs tend to be used (traditionally) for tasks where CPUs are a poor fit.

Cost drives everything in embedded systems. If you can perform the same task using programmable logic or software, you'll almost always end up choosing software because it's cheaper. Programmable logic is only used where a CPU isn't up to snuff performance- or power-wise, or where a CPU is physically incapable (e.g., because you're building an I/O bridge between the CPU and some other peripheral that talks a different interface).

Comment #11: Posted 2 years, 1 month ago by David Boddie

I completely agree with you, Adam. However, I have to admit that I was thinking of experimental systems rather than embedded systems when I mentioned FPGAs, so from more the perspective of, "Can it be done?" or, "What would a Python CPU look like?"

I was inspired to think about these things by the Gameduino shield for the Arduino platform whose coprocessor is basically a small Forth CPU: http://excamera.com/sphinx/gameduino/coprocessor.html

Of course, I'll be hacking on software for the foreseeable future. I don't have the skills to start doing hardware like that!
