Friday, 23 May 2014

Graphics are abundant.

Graphics. Graphics graphics graphics. Graphics graphics graphics, graphics graphics. Graphics.

This is a rant about graphics. This is a rant about the gaming community. I am not gonna tell you that graphics don't matter or that they don't matter as much as gameplay does. You've heard that already, and some people surprisingly don't like it! But I want to review the craziness about graphics, how insane it has become since the last gen (PS3, Xbox 360 era), and rant about how it affects the game industry.

You would argue: "But hey dude, wasn't it always like this? Haven't we had the same discussion since the 8bit era? Isn't this what some gamers always craved?". Yes. But in the past things were purer; graphics had never reached the kind of realism that is a game changer for the industry. Graphics have become so amazingly good that people are blinded by them more than ever. Also, because there are a lot more gamers and game forums now and everyone is whining about this, it's easy to see the absurdity.

I was curious about the new Wolfenstein game (which I preordered just for the Doom 4 beta btw :). So I was looking at various gameplay videos, and at one point I dreadfully decided to read the youtube comments and see what people thought about the game itself. Did they like it? Was it too short, too linear, or too much of the same typical FPS? And to my surprise, a lot of the comments were like "Why aren't the graphics next gen? Why are they so bad? Bad graphics = bad game!". I was like "What the hell are you talking about? Did you like the game? Would you suggest someone buy it?" and the replies were graphics, graphics, graphics, graphics. Whole paragraphs whining about how bad the graphics are (which they weren't; they were nothing extraordinary never seen before, aka next gen, but they were really really good, more than enough, abundant, and did the job) and concluding that the game is bad, without a single mention of gameplay (or even sound or story). Maybe those graphics whores always existed and we just didn't have youtube then. But it just strikes me. People are going crazy about games not being 1080p (I guess they bought the PS4 or Xbox One just for the graphics) and at the same time making fun of the WiiU with its "cartoonish" graphics. It's all that matters to them.

But you know, I was thinking, how does this affect the gaming industry? More development time spent on perfecting the graphics and less on gameplay. You will argue that it was always like this, and I wouldn't disagree. But things have evolved, and now, in order to aim for the state of the art, you need more effort and resources. I was reading a discussion about how games focusing on multiplayer may be killing the single player experience because devs are focusing more on the MP part. And you have "campaigns" (as they call them now) that are 4-8 hours of gameplay. But I think the reason is the huge amount of work a team needs to do to make a level map. Think how much easier it was for a game modder to make a Doom map (or, even easier, a Wolfenstein 3D map) and how much more time and effort a level for a modern engine needs (unless you want to do something bland that looks like PS1). Hell, some games are even like hollywood movies! Notice some trends that got popular since the PS3/Xbox 360 era. Quicktime events. Games that are basically interactive movies; beautiful graphics, yes, but you just have to press X, O, square or triangle to proceed. Heavy Rain, Two Souls and such. Some games have short bursts of action gameplay interrupted at frequent intervals by quicktime events. I fucking hate that! But it's perfect for graphics whores and people who bought one of these two consoles just to stare at the graphics. That's why we don't get a really good, challenging or lengthy single player experience (besides a few exceptions), but most gamers are satisfied by half-assed interactive movies with glorified graphics. I don't see many people crying about this, and maybe that's what sells.

But at the same time I realize: graphics are abundant. This is an important realization. Right now, choose a random game from five years ago. Search for the best games of some year in the past. 2008? 2005? I don't care. Some of you might think the graphics are outdated, but I disagree. They are more than enough. Graphics have progressed so much that if you go back two generations they are still really lovely to stare at and do their job more than well! Yes, you might not see next gen; they might be outdated by today's standards, but they are not dated. Maybe the first few generations of 3D graphics were too crude to look at (though I will still appreciate and play some), and you can see that now if you go back, wondering how you ever played with such low detail (especially evident on the N64, where the really lowres textures and the filtering made everything blurry and hard to see at a distance). But I can stare at a later generation from a few years ago and marvel. And that is beautiful and nice, how much we have progressed. So much that maybe a few gamedevs will decide to focus more on gameplay content, since the engines we have now are really doing a good job. But will they follow this road? The problem with graphics whores is that they want to see more and more of the progress. They bought the PS4 not to see what we can already see on PC, but something new, never seen before. But graphics are there at the edge, and sometimes they are so loaded, like you have games with tons of DOF, glow, bloom, HDR, smoke, particles, shadows, whatever, that you can't see anymore what's on the screen. All this drowning of the visuals in graphics seems like more and more heavy high tech shit for bat shit crazy graphics hogs.

And I am not even sure if most of them know what lies behind these engines. Most of them have learned to stare at visuals and argue whether they are better or not by single glances and intuition. They don't know the underlying algorithms and tech behind what they see (and since modern engines are drowned in so many features, even I can't tell you if one engine is richer than another), and so they haven't learned to appreciate and understand how amazing current engines are. They crave more and more and appreciate less and less.

Hell, the funniest thing is that I am a graphics programmer and yet I don't go crazy about that stuff, and I may even enjoy old games with "horrible" graphics. How is that possible? I should be the one going nuts about graphics! Yet I don't. Another joke: I recently met with four programmer friends and realized again that most of us carry an outdated mobile phone (some of them not even smartphones), which seems counter-intuitive to the common person. How are you such a great geek and yet don't crave the latest gen? I see mostly non-programmer friends of mine speaking about the latest fad in mobile phones, while I get bored of the discussions or can't relate. See facebook. Someone would wonder how I am not really into facebook, since I would be the first person (as a geek) they would expect to be into it. Most of my geek friends despise facebook. Why do we differ? Are we fed up that these things that were once our world became so mainstream? Are we just hipsters who want to be different again? Or do we appreciate even the old technologies because we have a better idea of the internals? I am not sure. If you asked me, I wouldn't deny that I sometimes feel like a hipster, because everyone gloats about his mobile phone, yet only I can program some good stuff for it (something that seems like magic to most). But I also don't feel the need to hook into the latest next gen fad for whatever reason. I will not feel like a lamer if I play some retro game on an Amstrad CPC with "horrible" graphics, since I have an idea about the hardware and restrictions of the machine, and thus I can enjoy playing it, being amazed at how they pulled off some of the stuff, even looking at it historically. A waste of time for some. As if they count good times by whether they were the first to play the most next gen thing possible, looking at the future, gloating at their star trek device (mobiles) and feeling modern.

I won't hide it. I feel like a hipster sometimes. I feel like they invaded my world. I stare at the mainstream and despise it. I think I am better because I code. At least I know my shit. I can appreciate the technology. I was there! And sometimes I cringe at their reactions. "Games are not mostly about graphics? How can you say that! Graphics are the first thing about games, they always were..". I laugh in their general direction.

p.s. Most of all I laugh at the new generation of consoles. PS4 and Xbox One are basically PCs if you check the specs, both AMD and on par with my current PC. And check most of the titles of this or the last generation: titles ported from PC. I wasn't interested in the PS3 or Xbox 360 at the time (except for some exclusives that never came to PC; maybe a good reason to get a cheap-ass used PS3 just to play those) because they were basically what the PC had to offer. It seems the same today. At least the Wii U offers unique hardware, a controller that many hated but may give a unique experience to games using it well, and the usual Nintendo exclusives I can't find on PC (can't say they're original, yet another Mario, yet another Mario Kart, Zelda, etc), so there is a lot of incentive for me to buy one (when the price drops, lol). Hell, the older generations of consoles were more interesting (even if I was never a serious console gamer, besides handhelds). I am collecting them for fun now. Each of them has unique hardware: some had just a quad rasterizer (Saturn, 3DO), others more complete 3d hardware (PS1, N64), the Jaguar had its obscure programmable chips, even the PS2 or Dreamcast had unique architectures, not a generic PC put in a box as today. And many more of the games were exclusives. Each piece of hardware was unique in the past (also home micros), something that is missing today. Console gamers are ranting about graphics, so I ironically become a hipster and tell them: why don't you just get a PC?

Wednesday, 28 August 2013

Gaming girls, nerd culture, who gives a damn?

Just an entertaining thought I recently had: We go berserk about "OMG, a girl who can play games" or something. But today is the day of gaming, and being "nerdish" and such is mainstream. It's no big deal. You can easily find girls in front of computers, playing games and such, or doing the other mainstream lamebook stuff. Even in my home country, where the netcafes were crazy places like arcade rooms (not what you think: it was new year's eve and you still saw people in the netcafes, and that was weird). And really good looking girls too. It's not something crazy today. It has even become mainstream.

And the full version of that entertaining thought was: Why are we surprised that someone can play a game? What's the fuss about "Omg, I play zelda, I am such a geek!"? You are doing the easiest thing in the world!!! The easiiiiesttt thing in the wooorld!!!!!!!

You know when I play games? When everything else has fallen apart! When I can't work, when I am too lazy to do housework, even too lazy to go out for a walk. There are many distractions, and they take the form of the easiest falloff, like the path of least energy, where things most easily go when you don't try at all. And gaming is one of the easiest paths of all! I am not saying that we shouldn't game. I love gaming! But it's just funny to think "OMG, that girl is a gamer, this is ingenious!". It's funny to still be surprised by it.

And that's the other entertaining fact: why, oh why, years ago, was it really that special to even find guys who were just into gaming? I mean, it was so rare that you could consider some writer at a gaming magazine like a god (while I would grant that status to some early programmers, having admiration for gaming gods is hilarious). Because nobody was into it! It was just so rare. You didn't need to be a coder. Being a gamer who maybe also fixed other people's computers (the "No, I will not fix your computer" t-shirt comes to mind :) and you were a guru.

But why? Why didn't people get into gaming then? I mean, even just gaming. Is it because of the non user friendly DOS and memory managers and IRQs and stuff? Maybe, but it doesn't seem like a big deal to me. Ok, things were harder, but games were always awesome. You just had to learn how to set up your autoexec.bat and config.sys and that's it! No more brainiac stuff. I do believe it's mostly a cultural thing. It was uncool. It wasn't mainstream. If you were into it, you were either jokingly called a hacker by ignorant people or, most of the time, a nerd or geek in the negative sense.

But it just entertains me. I had a thing about coder girls and that is still inspiring, but they do exist, and why not? But gaming people being something worth attention? The easiest thing in the world!!! The last resort of my laziness..

And then I was inspired by this one, which talks about culture and stuff, although I didn't feel that bad about Big Bang Theory, and I didn't exactly agree with his Borderlands review either. (Oh, I didn't know Jesper Kyd was behind the music. And big surprise, the art style and theme and even some scenes are "inspired" (ripped? who cares..) by an anime short unknown to me, Codehunters. Wow! That's so close!)

p.s. And I really like that meme!
p.p.s. Don't misunderstand, I don't want to be mean to girls into our hobbies though. This post is not about that..
p.p.p.s. If you love what you do, just do it. You don't necessarily need to identify yourself with a title. But we do need that sometimes, eh?

Friday, 26 April 2013


I am usually mad at common misconceptions that have somehow won people over so thoroughly that they sound self-evident. Ideas you hear everyone reciting, and just because everyone is saying them, they must be true. It's annoying how self-evidently these ideas are recited, without anyone trying to understand what they mean and how accurate they are, and how strong they become simply because everyone now says so. And while they might have some merit, they don't feel absolutely true if you try to make sense of them.

One such common idea, which I am so tired of hearing every now and then, comes up when I happen to discuss something with a friend and mention "Oh, I also read on wikipedia that blah blah...". And then he instinctively responds with the same parroted words: "Wikipedia is not a reliable source, it's inaccurate, anyone can edit it, blah blah..". Arghh!!! And what sources are you proposing instead? The local library? The university? Someone at NASA? What are your sources for specific subjects you don't have direct access to, when your best friend for that is the internet?

It depends on the subject. Of course I know the saying that you should not reference wikipedia in a scientific paper. Even wikipedia says so. Wikipedia articles are just a simple starting point on a subject, and they also cite the original sources used. But the classic saying, "wikipedia is oh so faulty", occurs so robotically, like a Godwin's law, even when discussing trivial information that is easy to verify, or insignificant claims. And the absolute way they recite this saying gives you the impression that wikipedia is an extremely bad source of information, one of the worst on the internet. Which is not true; quite the contrary. Wikipedia is for me one of the best sources on the internet!

Why am I saying that? Whenever I hear about a new term, organization, person or anything, I immediately reach for wikipedia first. One would say that is because it's a good, easy head start on a subject, from which one can continue by following the references. That is one reason, but there is more juice. Sometimes you can't be sure about something from different sources, because each source is opinionated or has a different agenda. Wikipedia is maintained by thousands of users, each obsessed with different topics, trying to be as insanely correct as possible about each subject. Which of course can also lead to quarrels, but these are resolved by several people, and their different views might be blended (or not). It's as good as it can get given the nature of the internet. And sometimes you get a broader and more complete article which says "The definition for this is A, but it's disputed by some other people, while that guy said it's C [citation needed]".

Search, for example, for the definition of hacking on the internet. Confusing! Information wants to be free, the Mentor's manifesto, revolution, no script kiddies they are, but there are true hackers, the others are called crackers; confusing, opinionated things, each site telling a different story. And that's where the wikipedia article shines! It says something like "1) It used to mean the MIT guys, or a state of mind of creative people, 2) then the computer hobbyists, 3) now the computer network intruders, per the media, etc. And these guys dispute the definition, while the other guys say this about it, etc..". And then the history, and then the whole bunch of links and references. At least you get the whole picture: what it is, what it was, what the different opinions are, how it changed through history, in a neat article that tries not to be too big yet covers the most important information for a basic understanding of the subject. Another example would be learning about a company or organization you just heard about. Would you say the best plan is to visit the organization's website? Of course, at first; it depends on what you are searching for too. But what if you want a more well-rounded, objective view, rather than only the positive promotion on the company's website? What if what you are searching for is not "We have the best products, 20 years of positive experience, the happiest and most productive personnel, blah blah" but raw words like "This company was started by that guy in 1980, in 1990 they produced that but it didn't sell well, in 1998 there was a scandal with the CEO, blah blah"? This is what I like in wikipedia! A good head start, raw words, fact-like, no agenda in only one direction, different views presented in a neutral way, and additionally references to external links if you want to verify the facts.

When most of our sources come from the internet today (depending on the subject of course) and wikipedia has all these qualities (and when it doesn't, it's at least a good starting point), how can you say that it is a bad source? Then most of the rest of the internet is worse. Unless you believe that the internet itself is a bad source of information. Says the person who uses it every day to read or spread information. How generic is that? The internet is you and me and every one of us. It's like saying that people and their sayings are a bad source of information. Then what is left? This is the same as that old, equally robotic saying that 95% of users on Pouet are illiterate and have nothing to do with the scene. WHAT? But Pouet is the ONLY demoscene community site that big, where people come together, with discussions and everything! Pouet IS the demoscene on the internet (if not in real life). Why do people pass such things from mouth to mouth without thinking a bit? How can 95% of Pouet be irrelevant to the scene? And are the people who make these claims in the 5%? Then what the hell are they doing on a site that is 95% non scene related? I know that more than 95% of the people who frequent it have produced something for the scene. But 95% of people keep reciting the same old song :P

Don't misunderstand me. I am not saying that wikipedia or the internet is GOD. I am not saying it's 100% infallible. It's just as right as humans can be. When I mention that I read something on wikipedia, I am not saying that it must be true. I am saying that I read something in some source (on the internet or not, it's irrelevant) and wondering what the other person has to say about this information. The stupid answer would be "The information is bullshit, because wikipedia is unreliable" and the more mature would be "I think this information is not correct because of this and that reason, regardless of where you read it". Hell, the information wouldn't even be more reliable if I heard it from a person of high authority. It's like the old quarrels in debates: "I have 2 masters, 3 PhDs, I am the best in my domain, so you are totally wrong and stupid!" :P

Classic fallacy I guess, blaming the messenger instead of trying to understand and verify the message.

Sunday, 20 January 2013

GCW Zero rants

So, it's going very well now; after the upgrade of the hardware specs there was a sudden spark of bids that I cannot explain, because for me the additional RAM and storage are not very important, but cool anyway. The biggest thing would be an upgrade of the LCD. Although I am used to 320*240, higher resolutions would be interesting for coding demos and seeing how they perform. But not entirely necessary for the gamepark/dingoo community. I think another missing thing is a touch display, maybe only important if you want to properly play ScummVM or the specific touch games I liked from my Caanoo. But anyway, you can't have everything, and any change to the LCD might need more time redesigning the hardware, writing drivers, and more costs too. As my latest favorite homebrew handheld was the Caanoo, it will feel like I am missing something, even though most software didn't make use of the touch screen.

Many argue that the resolution and memory were such a put-off, as was the absence of a second analog stick. Coming from the gamepark/dingoo communities, where even 64MB was enough for emulation, I don't get the feeling that 256MB is too little. The first edition of the Raspberry Pi had 256MB of RAM. If you look at the specs of many later or even modern game consoles, you will be surprised at how little RAM they had. Someone argued that indie devs won't optimize their games and 256MB would be too little. Well, if you have seen the quality and scale of most of the homebrew games, they are very basic 2d stuff, and they already played well on the old 64MB devices. Quake 3 needs 64MB of RAM. Who is going to write such a big title for the GCW? Even the PS3 had 512MB of RAM; with the GCW spec updates we are on par.

I think the resolution can be more of a put-off; it's one of the specs that actually never changed since the GP32! Yes, the 800*480 of the Pandora would be a great asset, but I am personally content with what it is; for me it's like a more powerful Caanoo where the community finally gathers again. I know there are chinese android devices more powerful and cheaper than this, but those are released as ever-new models; they don't have a concentrated community (they have the android market, but not the feeling of a specialized community coding stuff for exactly one handheld). As for the resolution, did you know that your Nintendo DS has 256*192? Just like my Spectrum 48k! And the games were awesome!!! We just don't realize it because the screen is small. Graphics looked so cool at 320*240 on all the past handhelds. There is a little problem with some emulated hardware that sometimes uses higher resolution modes (the SNES has higher resolution modes (512*240 and some interlaced), the PSX has a 640*480 iirc, even my beloved Amstrad CPC has 640*200 in Mode 2 and goes way beyond with overscan screens). Those are rare cases though, and I think I have already enjoyed some Mode 2 screens in a CPC emulator on the Caanoo. Maybe a higher resolution would be handy for web browsing, though this is not the main reason for buying the GCW0 (I just bought a Google Nexus 7 tablet for that).
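For perspective, here is a quick back-of-envelope comparison of the raw pixel counts of the screens mentioned above (my own arithmetic, not from any spec sheet):

```python
# Raw pixel counts of the display resolutions discussed above.
screens = {
    "GCW Zero / Caanoo (320x240)": 320 * 240,
    "Nintendo DS (256x192)": 256 * 192,
    "Amstrad CPC Mode 2 (640x200)": 640 * 200,
    "Pandora (800x480)": 800 * 480,
}

base = screens["GCW Zero / Caanoo (320x240)"]  # 76800 pixels
for name, pixels in sorted(screens.items(), key=lambda kv: kv[1]):
    # Show each screen relative to the 320x240 baseline
    print(f"{name}: {pixels} pixels ({pixels / base:.2f}x the 320x240 screen)")
```

The Pandora screen pushes five times as many pixels as the 320*240 one, which also means five times the fill-rate cost for any software-rendered demo effect.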

As for the second analogue stick? Sorry, I am not into XBOX/PS3 FPS games; I am not gonna play modern FPS on this device (I would prefer mouse and keyboard controls on PC). I mostly liked the ports of PC FPS games on such devices as a technical achievement rather than something I would play. And to tell you the truth, old FPS games especially, like Doom or Quake, played well enough even on my GP32 with only two fire buttons on the right! Or so I think..

I guess the GCW0 is for a specific kind of people who already understand what the machine is meant to be, who know the history of the open handheld communities from the gamepark/dingoo times, and who want something new with a strong community, for emu/homebrew gaming and retro game programming too. At first I was afraid this kickstarter would fail, because there are few homebrew fans out there who would distinguish this from your average android handheld (or from the lame Neogeo-X, which just has publicity and, I learn, uses the same chipset as the GCW0, but locked to play only 20 Neo Geo games, and overpriced too), but I see the reality is different, the support is great, and hopefully this will live long.

Maybe if this is successful enough and the creators want to go for a GCW1, then we might see a higher res and a touch screen (the only things that would be a big plus for me right now), but first let's see this one develop a great community. It might make me want to go back to coding something new for this device instead of just porting my old demos. I'd like to make a game..

Thursday, 10 January 2013

GCW Zero

GCW-Zero: Open Source Gaming Handheld

Finally! I really hope this project goes well. I backed it with $160 (+ $20 postage) because I really want this one, and that was a little above the $135 minimum you can give to also preorder the handheld. I know, too much, you will say. People ask me, why this when you can get a cheap phone? Because I hate touch. I can't play emulators with touch. Oh, but you can add that hardware joypad that attaches to phones. I am not going to carry another controller along with my phone!

I am talking about a niche category: a homebrew handheld from the community, not even a commercial one that happens to run homebrew. (They tell me the PSP is enough for emulation, why do you need a Caanoo? I haven't unlocked my PSP yet because it's too much work; my firmware was updated and I'd have to find another, broken PSP or something, fuck that shit!)

And I am talking about homebrew made by the community, supported by the community. I know there were some cheaper chinese android handhelds that even played N64 well, but these were not supported, nobody knows them, and the chinese companies make many series of them instead of concentrating on a single one.

We needed something like a successor to Dingoo/Gamepark. Something for the homebrew console community (not touch, not android, just classic gamepark style!). Pandora failed to deliver and was too expensive. But now we have GCW Zero!

Some info
Kickstarter link

p.s. A while ago I wondered about the community and whether it would be nice to have an appstore. Well, gamegadget (regardless of whether it was as powerful as an old dingoo) didn't go well (and I've heard the company didn't behave well), the nD is a joke (where is it?), so forget all that stuff and take the GCW Zero, made by several members of the community (see the kickstarter for a list). This is a more serious attempt. Scene gonna move!
p.p.s. Other good news. My old classic GP32 died in 2008. I just decided to buy a used one from Ebay. Not telling you how much I paid. But I really miss the nostalgia, and there isn't even a proper emulator. I am just waiting for it!

Thursday, 15 November 2012

Old computers are not the junk you might think they are

It happens that I have a quite different perspective on old computers. I like to stretch my sense of what the true capabilities of a very old computer are. It's not nostalgia, it's not being a fanatic; it's just a feeling that I love: that of seeing an old computer that people consider extremely slow do really cool stuff on the screen.

And yet again, I have a much better appreciation of what old hardware can do than most people. Not just as a programmer but as a user of retro PCs too. There is a bias where people observe what their current PCs can do and how performance hungry software already is, which makes them extrapolate that hardware from even 3-5 years ago would be extremely boring and that you wouldn't be able to do much with it.

Well, the problem is that most software doesn't exactly represent well what the true power of older machines is. If you try to run the latest software on PCs from 5 years ago then yes, everything will grind to a halt. And maybe the web is becoming so demanding, with its heavily loaded websites full of javascript and ads, that it could really be a drag to try surfing the net with a Pentium 3 or 4. I don't know, I haven't tried that. But the fact is that you can always find software written back when the Pentium 3 was common and discover that you could really do lots of cool stuff at that time.

Here is an example of how extrapolating would make you think a 386 is a fucking calculator. So, I have my ultra new dual core PC and I bought a new graphics card. I install it, and then there is the moment when I want to uninstall my old drivers and install the latest ones. During that limbo time when no drivers are installed, maximizing or scrolling a single window shows you the very slow redraw of the window in realtime. Why, I'd say? Many people would say it's the 2d gfx acceleration that makes it fast, which isn't there without drivers. It could be. But so slow? I remember that back with my 486 and a crappy S3 Virge, redrawing was fast enough. One would say I was at 800*600 with 16bit color then, while now I am at 1920*1080 with 32bit color, and he wouldn't be entirely wrong. But the message is this: a dual core AthlonXP without hardware acceleration draws windows slowly, so what would a 486 with an S3 Virge do if we extrapolated? Maybe they were still living in DOS? A GUI on a 486? No fucking way!!!

Or what would they think of the idea of a windows-like OS running on an Amstrad CPC 8bit computer? "Not even with bullets", as we say in Greece :). Like maybe waiting 10 minutes for a window to redraw? Yet this is the true, stretched capability of what a CPC can do: SymbOS video (I know it's using a CPC T-rex with a Z80 at 24Mhz, but I couldn't easily find a good plain CPC SymbOS video showing enough features, though I have witnessed it myself and it's not much slower). Now, extrapolate in the opposite direction. If a CPC can do that, imagine what a 286 could do!

I know, I know: a CPC has 16kb of videoram while a 286 has 64kb in plain VGA 13h mode, and as systems improve they add better resolutions and color depths, which need faster gfx cards and CPUs, etc, etc. Maybe that's a good argument for the negative extrapolation: yes, the CPC had only a Z80 at 4Mhz, but it didn't need to output 8MBs of video memory each frame as my 1920*1080*32bpp screen needs. And it had no accelerated functions on a graphics card either.
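A quick sanity check of those per-frame figures (my own arithmetic, using the numbers quoted above):

```python
# Per-frame framebuffer sizes for the machines discussed (in bytes).
cpc = 16 * 1024           # Amstrad CPC: 16 KB of video RAM
vga_13h = 320 * 200 * 1   # VGA mode 13h: 320x200 at 8bpp = 64000 bytes (~64 KB)
modern = 1920 * 1080 * 4  # 1920x1080 at 32bpp (4 bytes per pixel)

print(f"CPC frame: {cpc} bytes")
print(f"VGA 13h frame: {vga_13h} bytes")
print(f"1080p/32bpp frame: {modern / 2**20:.1f} MB")  # ~7.9 MB, roughly the 8 MB mentioned
print(f"That's {modern // cpc}x the CPC's entire video RAM, every single frame")
```

So a modern desktop pushes around five hundred CPC framebuffers' worth of pixels per frame, which is exactly why the naive extrapolation down from slow 1080p software redraws is misleading.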

Sometimes I like to extrapolate in the positive way with my own projects. I have coded enough demos on Gamepark devices with an ARM7 processor at 100 or 200Mhz, or even the latest Caanoo at 533Mhz (my unit can overclock to 800Mhz). Of course, I know that Mhz is not everything, and memory bandwidth or a small cache can bring those things down a lot, but the point is that I can't help but admire those Acorn Archimedes demos I have watched again and again in an emulator (especially those from the group Xperience), where there is either an ARM2 at 8-12Mhz (old models) or a later ARM3 at 33-40Mhz, and yet you see good 3d or 2d effects at resolutions similar to the Gamepark handhelds (ok, their color depth is 8bit and not 16bit like the one I used). I know how hard it is, because I tried to write some pixel-per-pixel effects on the GBA, with its ARM at 16.8Mhz, where the resolution is even lower (240*160), yet it took effort to run even a simple 2d effect at something like 25fps. I believe many GBA coders optimize their routines with ARM assembly, and I know that the Xperience group did that too. Now, my gamepark demos were pure C, and I never felt the need to optimize with assembly at the time I was writing them, because the frame rate was already high enough for most of my effects, but I would love to try some ARM assembly one day (which I've heard is quite fun) and see what those beasts could do if their powers were harnessed!
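To put the GBA experiment in numbers, here is the rough cycle budget per pixel, assuming the figures quoted above (a back-of-envelope sketch that ignores memory wait states, loop overhead and blanking periods):

```python
# Cycle budget per pixel for a full-screen software effect on the GBA,
# using the numbers from the text: ~16.78 MHz CPU, 240x160 mode, 25 fps target.
cpu_hz = 16_780_000
pixels_per_frame = 240 * 160
fps = 25

cycles_per_pixel = cpu_hz / (pixels_per_frame * fps)
print(f"{cycles_per_pixel:.1f} CPU cycles per pixel")  # ~17.5 cycles to read, compute and write each pixel
```

Seventeen-odd cycles per pixel is barely enough for a texture fetch and a write once real memory latencies are counted, which is why even a simple per-pixel 2d effect took effort at 25fps.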

Yes, I think of these devices as beasts. I think that even the old GP32 with its ARM at 100MHz (which I mostly overclocked to 156MHz) could easily be like a very fast 486 or early Pentium, excluding the FPU of course. Those monsters do more work per cycle (it's said the ARM2 at 8MHz has about 4 times the MIPS of a 68000 at the same clock, and I have witnessed it in some videos) than older x86 revisions or the Z80, for example. Yet consider the other fact: my smartphone is a slow one. It has an ARM at 600MHz, similar to my Caanoo. But Android crawls (because it's based on Java?) and is unresponsive on it. I have seen faster phones that can handle it, some with dual core processors. But it's a beast; I know that an ARM at 600MHz is a beast, because I consider even the GP32 with the 100MHz ARM a beast. And I've watched plenty of Archimedes demos and tried RiscOS on a 33MHz (emulated) computer. If I extrapolated backwards from my latest smartphone running Android, I would feel like my current phone is a big piece of junk (which I do feel, judging by the unresponsiveness), but I know it isn't. I can appreciate what is in there and what would really be possible.

And then I'd like to add to this conversation the counter-effect of false retro memories. Sometimes we remember our old computers as faster than they really were. Maybe because we were used to those speeds then, and coming back from the future it feels different. I was transferring some ZIP and ARJ archives of old games to my 386 (disk by disk, ugh) and then decided to decrunch some of them. My initial reaction was: "What? I don't remember decrunching taking so long on my old PC! Was it really that slow?". Either I had false memories, or being used today to instant decrunching (even with larger archives) made it feel odd.

But it doesn't change how I feel about old computers. I don't look at them as pieces of junk that are as fast as pocket calculators. My friends watch me coding for old computers and wonder: "A 386? What can you do with it? If I had only this I would be bored to death.". Hell, they even feel like this about a Pentium 4. But a few years ago we used to work on these machines, and we thought a Pentium 4 was like paradise: you could run the latest games, surf the web, watch movies, etc. The other thing here is that I don't see computers as media devices. I see them as things to experiment with, to program, to make do whatever I like. I feel the essence of code and creativity. Slow processors and primitive graphics don't bother me. I am perfectly happy even with the oldest thing I can program.

Some people might see random ugly pixels on a CPC and miss their HD media players and modern 3D, but I get enjoyment from deciding how to make those pixels blink on the little CPC screen. And I discover that with some clever optimization those little 4MHz can do a lot of cool stuff! And if I am already so pleased with the performance and creative possibilities I can harness from a CPC, imagine how much more this extrapolates to my 386 or the beasts we carry with us today. It feels so great; I would never get bored of computers, the way I see them, even if production of better computers ground to a halt. There is still a lot to explore, even in computer communities where you'd think they have done everything and there is nothing more to see (just check the C64 scene: every year you see something that makes you think this is the end, they have exploited the machine to its full potential, and yet a few months later you see new, even more impossible things).

Creative use of computers makes this feeling possible. I could never enjoy a computer so much just from the mindset of a consumer.

p.s. I got into writing this post after being inspired by a twitter message about how angry the author was with people underutilising the CPU and then saying "let's do it on the GPU, the CPU is crap". Come on people, CPUs are huge beasts already! (Well, I know GPUs are much greater beasts, though not as multi-purpose.)

Tuesday, 31 July 2012

Bitness of a CPU

For yet another time, I've been asking local geeks what defines the number of bits of a CPU. I was puzzled again because I have read different answers about the definition, and unexpected values for specific CPUs. So I decided to investigate this matter further, aka do my homework (I won't sleep tonight :P).

The most common two opposing metrics are:

  • The size of the registers defines the bits of a CPU
  • No, not only the register size, but also the data bus width

Some considerations:

  • Is the Z80 a 16bit processor then? It has 16bit registers! No, those are "fake" regs created by the 8bit pairs. OK, so it's the size of the "true" registers. Question: how do we tell that they are not real regs? More cycles needed? Very few arithmetic operations on the pseudo-16bit regs? An 8bit internal bus?
  • When we say data bus, what do we mean? In the second metric above, most people mean the external data bus, i.e. the communication between CPU and memory. When that argument arises, that's the one they mean.

The chaos of terminology:

  • I tried to research this by reading the definitions of terms like data bus and internal/external data bus on Wikipedia. I need to know the terminology so that I can understand the different arguments better.
  • In the Wikipedia article about the data bus, under the definitions of internal and external bus, I read this:  "Internal bus, also known as Internal data bus, memory bus or system bus or front-Side-Bus, connects all the internal components of a computer, such as CPU and memory, to the motherboard"  and  "The external bus, also known as expansion bus, is made up of the electronic pathways that connect the different external devices, such as monitor, printer etc, to the computer.".  Might be correct (what do you say?) but totally inconsistent with the way this terminology is used in the bitness arguments.
  • In the usual CPU arguments about bitness, by external data bus people always mean the CPU<=>memory communication, and by internal data bus probably the communication inside the CPU (between the regs and the ALU and who knows what other stuff). The same meaning is used in the Wikipedia articles describing each CPU. But in the data bus article, the internal bus is all of that together, and the external bus is something probably irrelevant to the bitness of the CPU.
  • To make things more confusing, sometimes when arguments arise they don't even specify whether it's internal or external, for example: "You are not right, this CPU is not a 32bit but a 16bit, because the data bus is 16bit only". Which one? (I can only deduce that most of the time they mean the CPU<=>memory one, because in the vast majority of CPUs it happens that regs size == internal data bus width.)

Other less common metrics I have heard:

  • ALU bits. It can mean many things (I haven't entirely understood it yet):
    • Internal communication width between CPU and ALU. In a few words, our classic internal data bus definition (as used by people in the argument, not the Wikipedia article I linked). Usually it happens to be the same width as your CPU regs.
    • Do most standard arithmetic operations exist for those bits too? Example: the Z80 with its pseudo-16bit regs can do ADD HL,DE/BC, but lacks 16bit forms of most of the logical and other operations, so it's not 16bit.
    • Calculation bits per cycle! Does the Z80 spend one cycle in the ALU to do an 8bit arithmetic calculation? I just found from two different random sources (just forum discussions, nothing more official yet :P) that the ALU of the Z80 worked on 4 bits at a time, which means two passes are needed for a single 8bit add. Someone joked that by this metric the Z80 would be a 4bit processor, and that made me wonder WTF?
  • Address bus width. Of course this is absurd and nobody brings it into the argument. Yes, the Z80 and 6502 have a 16bit address bus because they need to address 65536 bytes of memory, but are they 16bit? Of course not! The classic PC segment/offset style was 20bit; does that make the 8086 a 20bit processor? Nope! Though classic Wikipedia articles on bits, like for example the 32-bit article, say this:  "Also, 32-bit CPU and ALU architectures are those that are based on registers, address buses, or data buses of that size."  You could say it says they could be based, not should be based, though it could be misleading for someone. Anyway, I am just mentioning this term to exclude it from the bitness argument.
  • Another confusing thing is the bitness of a machine. Of course this doesn't make sense today, and it was sometimes a marketing gimmick. For example, the Atari Jaguar. They said it was 64bit, and yet I know it had a 68000 as its main CPU. Though I recently read on the Jaguar wiki page that alongside the 68000 it also houses several programmable graphics chips with a 64bit architecture (it doesn't define which CPUs), and the 68000 is just the chip that manages all of these. Even the external data bus between those chips is 64bit, I read now. Pretty strange architecture, I'd say. But anyway, the bitness of a machine is a different, more subjective matter and not our current subject, which is the bitness of a CPU.

So suddenly I read strange things about CPUs:

  • The 68000. I always thought of this as a 16bit CPU (maybe because people associated the Amiga or the Mega Drive with 16bit, you know, the next generation/marketing confusion (counter argument: but why not market them as 32bit and be more impressive then?)), but some people refer to it as 32bit. I read that it certainly has 32bit registers and possibly(?) a 32bit ALU, but the external data bus (CPU<=>memory) is surely 16bit, and in one Wikipedia source I've read that even its internal data bus was 16bit. Most sources I've found label it a 16/32bit processor because of that (the 68000 wiki article cites the Motorola MC68000 Family Programmer's Reference Manual for this).
  • In contrast, the 386SX is fully 32bit internally with a 16bit external data bus (it's the crippled-down version of the 386DX, which was 32bit in that aspect too), yet I haven't heard anyone call it a 16bit. In that sense we have another metric which says: bitness of a CPU = regs size & internal data bus (and the external data bus doesn't count).
  • The same happens with the 8088, which is a crippled-down version of the 8086. All is 16bit except, again, the external data bus, which is 8bit. In most places I have heard it's a 16bit CPU, except a few where I see the 8/16bit distinction.

And to finish my little big post, I'd like to say that I just wanted to put things in perspective, so that I can collect all these conflicting thoughts together for a better overall view, and maybe also read what other people have to say (if anyone ever reads this and decides to post anyway). I am not a maniac about the number of bits, and I don't want to defend my favorite CPUs or anything; it's only that sometimes I get too obsessive about extracting a few certain facts from tons of conflicting opinions on the internet. If you read a single forum where they discuss bitness, you are never sure about the right definition, but if you search more sources and more discussion forums you start building a more well-rounded view, which might still be conflicting, but at least you can collect these things together and try to make sense of them. I am just obsessing about making sense.

Anyway, even the CPU metric is sometimes subjective and, as someone said, who cares about bits? Just program the damn thing! The Z80 with its fake 16bit regs is still an 8bit CPU (I never doubted it), yet you can do miracles with those additional opcodes (8.8 fixed-point additions with far less effort and fewer cycles than on the 6502), so it's still an advantage. And maybe instead of labeling a CPU with a single bit count, describing that it has 32bit regs, a 32bit ALU, a 32bit internal bus and a 16bit external bus is a better way to decide for yourself. Or the more complete 16/32bit labeling would suffice. Or say it in words: this CPU is 32bit internally but 16bit externally. Makes more sense.