Wednesday, 28 August 2013

Gaming girls, nerd culture, who gives a damn?

Just an entertaining thought I recently had: we go berserk about "OMG, a girl who can play games" or something. But these are the days when gaming and being "nerdish" are mainstream. It's no big deal. You can easily find girls in front of computers, playing games and such, or doing the other mainstream lamebook stuff. In my home country the netcafes were crazy places, like arcade rooms (not what you think); it was New Year's Eve and you still saw people in the netcafes, and that was weird. And really good looking girls too. It's nothing crazy today. It has even become mainstream.

And the full extent of the entertaining thought was: why are we surprised that someone can play a game? What's the fuss about "OMG, I play Zelda, I am such a geek!"? You are doing the easiest thing in the world!!! The easiiiiesttt thing in the wooorld!!!!!!!

You know when I play games? When everything else has fallen apart! When I can't work, when I am too lazy to do housework, even too lazy to go out for a walk. There are many distractions, and they take the form of the easiest fall-off, the path of least energy, where things naturally go when you don't try at all. And gaming is one of the easiest paths of all! I am not saying that we shouldn't game. I love gaming! But it's just funny to think "OMG, that girl is a gamer, this is ingenious!". It's funny to still be surprised by it.

And that's the other entertaining fact: why, oh why, years ago was it really that special to even find males who were merely into gaming? I mean, it was so rare that you could consider some writer at a gaming magazine a god (while I would grant that status to some early programmers, having admiration for gaming gods is hilarious). Because nobody was into it! It was just so rare. You didn't need to be a coder. Be a gamer, maybe also fix other people's computers (the "No, I will not fix your computer" t-shirt comes to mind :) and you were a guru.

But why? Why didn't people get into gaming then? I mean, even just gaming. Is it because of the non-user-friendly DOS and memory managers and IRQs and stuff? Maybe, but it doesn't seem like a big deal to me. Ok, things were harder, but games were always awesome. You just had to learn how to set up your autoexec.bat and config.sys and that's it! No more brainiac stuff. I believe it's mostly the cultural thing. It was uncool. It wasn't mainstream. If you were into it, you were either jokingly called a hacker by ignorant people or, most of the time, a nerd or geek in the negative sense.

But it just entertains me. I had a thing about coder girls and that's still inspiring, but they do exist and why not? But people who merely game being something worth attention? The easiest thing in the world!!! The last resort of my laziness..

And then I was inspired by this video, which talks about the culture and stuff: http://www.youtube.com/watch?feature=player_embedded&v=H7A5OgfP4NA although I didn't feel that bad about Big Bang Theory, and I also didn't exactly agree with his Borderlands review. (Oh, I didn't know Jesper Kyd was behind the music. And big surprise, the art style and theme and even scenes are "inspired" (ripped? who cares..) by some anime unknown to me, Codehunters. Wow! That's so close!)

p.s. And I really like that meme!
p.p.s. Don't misunderstand me, I don't mean to be mean to girls who are into our hobbies. This post is not about that..
p.p.p.s. If you love what you do, just do it. You don't necessarily need to identify yourself with a title. But we do need that sometimes, eh?

Friday, 26 April 2013

Wikipedia

I am usually mad at common misconceptions that have somehow won people over to the point of sounding self-evident. Ideas that you hear everyone reciting, and just because everyone is saying them, they must be true. It's annoying how self-evidently these ideas are recited without anyone trying to understand what they mean or how accurate they are, and how strong they become simply because everyone says so. And while they might have some merit, they don't feel absolutely true if you try to make sense of them.

One such common idea, which I am tired of hearing every now and then, comes up when I happen to discuss something with a friend and mention "Oh, I also read in wikipedia that blah blah...". And then he instinctively responds with the same parroted words: "Wikipedia is not a reliable source, it's inaccurate, anyone can edit it, blah blah..". Arghh!!! And what sources are you proposing instead? The local library? The university? Someone at NASA? What are the sources for specific subjects that you don't have direct access to, when your best friend for that is the internet?

It depends on the subject. Of course I know the rule that you should not reference wikipedia in a scientific paper. Even wikipedia says so. Wikipedia articles are just a simple starting point on a subject, which also cite the original sources used. But the classic saying, "wikipedia is oh so faulty", comes up so robotically, like a Godwin's law, even when discussing trivial information that is easy to verify, or insignificant claims. And the absolute way people recite this saying gives you the impression that wikipedia is an extremely bad source of information, one of the worst on the internet. Which is not true, quite the contrary. For me, wikipedia is one of the best sources on the internet!

Why am I saying that? Whenever I hear about a new term, organization, person or anything, I immediately reach for wikipedia first. One would say that's because it's a good head start on a subject the easy way, and then one can continue by following the references. That is one reason, but there is more juice. Sometimes you can't be sure about something from different sources, because each source is opinionated or has its own agenda. Wikipedia is maintained by thousands of users, each obsessed with different topics, trying to be as insanely correct as possible about each subject. This can of course lead to quarrels too, but they get resolved by several people and their different views might be blended (or not). It's as good as it can get given the nature of the internet. And sometimes you get a broader and more complete article which says "The definition for this is A, but it's disputed by some other people, while that guy said it's C [citation needed]".

Search, for example, for the definition of hacking on the internet. Confusing! Information wants to be free, the Mentor's manifesto, revolution, they are no script kids, there are true hackers and the others are called crackers... confusing, opinionated things, and each site tells a different story. And that's where the wikipedia article shines! It says something like "1) it used to be the MIT guys, or a state of mind of creative people, 2) then the computer hobbyists, 3) now the computer network intruders as used by the media, etc. And these guys dispute the definition, while the other guys say this about it, etc..". And then the history, and then the whole bunch of links and references. At least you get the whole picture: what it is, what it was, what the different opinions are, how it changed through history, in a neat article that tries not to be too big yet covers the most important information for a basic understanding of the subject.

Another example would be learning about a company or organization you just heard about. Would you say the best plan is to visit the organization's website? Of course, at first; it depends on what you are searching for too. But what if you want a more well-rounded, objective view, rather than only the positive promotion on the company's website? What if what you are searching for is not "We have the best products, 20 years of positive experience, the happiest and most productive personnel, blah blah" but raw words like "This company was started by that guy in 1980, in 1990 they produced that but it didn't sell well, in 1998 there was a scandal with the CEO, blah blah"? This is what I like in wikipedia! A good head start, raw words, fact-like, no agenda pushing in only one direction, different views presented in a neutral way, and additionally references to external links if you want to verify the facts.

When most of our sources today come from the internet (depending on the subject of course) and wikipedia has all these qualities (and if it doesn't, at least it's a good starting point), how can you say that it is a bad source? Then most of the rest of the internet is worse. Unless you believe that the internet itself is a bad source of information. Says the person who uses it every day to read or spread information. How generic is that? The internet is you and me and every one of us. It's like saying that people and their sayings are a bad source of information. Then what is left? It's the same as that other old robotic saying that users on Pouet are 95% illiterate and have nothing to do with the scene. WHAT? But Pouet is the ONLY big demoscene community site of its kind where people come together, with discussions and everything! Pouet IS the demoscene on the internet (though not in real life). Why do people pass such things from mouth to mouth without thinking a bit? How can 95% of Pouet be irrelevant to the scene? And are the people making these claims in the 5%? Then what the hell are they doing on a site that is 95% non-scene-related? I know that more than 95% of the people who frequent it have produced something for the scene. But 95% of people keep reciting the same old song :P

Don't misunderstand me. I am not saying that wikipedia or the internet is GOD. I am not saying it's 100% infallible. It's just as right as humans can be. When I mention that I read something on wikipedia, I am not saying that it must be true. I am saying that I read something in some source (whether on the internet or not is irrelevant) and wondering what the other person has to say about this information. The stupid way would be to answer "The information is bullshit, because wikipedia is unreliable", and the more mature one would be "I think this information is not correct because of this and that reason, regardless of where you read it". Hell, the information wouldn't even be more reliable if I had heard it from a person of high authority. It's like the old quarrels in debates: "I have 2 masters, 3 PhDs, I am the best in my domain, so you are totally wrong and stupid!" :P

Classic fallacy I guess, blaming the messenger instead of trying to understand and verify the message.

Sunday, 20 January 2013

GCW Zero rants

So, it's going very well now. After the upgrade of the hardware specs there was a sudden spark of pledges that I can't quite explain, because for me the additional RAM and storage is not very important (cool anyway). The biggest thing would be an upgrade of the LCD. Although I am used to 320*240, higher resolutions would be interesting for coding demos and seeing how they perform. But that's not entirely necessary for the gamepark/dingoo community. I think the other thing that is missing is a touch display, maybe only important if you want to properly play ScummVM or the specific touch games I liked on my Caanoo. But anyway, you can't have everything, and any change to the LCD might need more time redesigning the hardware, writing drivers, and more costs too. As my latest favorite homebrew handheld was the Caanoo, it will feel like I am missing something, even though most software didn't make use of the touchscreen.

Many argue that the resolution and memory were such a put-off, and also the absence of a second analog stick. Coming from the gamepark/dingoo communities, where even 64MB was enough for emulation, I don't get the feeling that 256MB is too little. The first edition of the Raspberry Pi had 256MB of RAM. If you look at the specs of many later or even modern game consoles, you will be surprised at how little RAM they had. One person argued that indie devs won't optimize their games and 256MB would be too little. Well, if you have seen the quality and scale of most of the homebrew games, they are very basic 2d stuff and they already played well on the old 64MB devices. Quake 3 needs 64MB of RAM. Who is going to write such a big title on the GCW? Even the PS3 had 512MB of RAM; with the GCW spec updates we are up to par.

I think the resolution can be more of a put-off; it's one of the specs that has actually never changed since the GP32! Yes, the 800*480 of the Pandora would be a great asset, but I am personally content with what it is, and for me it's like a more powerful Caanoo where the community finally gathers again. I know there are Chinese android devices more powerful and cheaper than this, but those are released as ever newer models and don't have a concentrated community (they do have the android market, but not the feeling of a specialized community coding stuff for exactly one handheld). As for the resolution, did you know that your Nintendo DS has 256*192? Just like my Spectrum 48k! And the games were awesome!!! We just don't realize it because the screen is small. Graphics looked cool at 320*240 in all the past handhelds. It's a little problem for some emulated hardware that sometimes uses higher resolution modes (the SNES has higher resolution modes (512*240 and some interlaced), the PSX has a 640*480 iirc, even my beloved Amstrad CPC has 640*200 in Mode 2 and goes way beyond with overscan screens). Those are rare cases though, and I think I have already enjoyed some Mode 2 screens in a CPC emulator on the Caanoo. Maybe a higher resolution would be handy for web browsing, though this is not the main reason for buying the GCW0 (I just bought a Google Nexus 7 tablet for that).

As for the second analogue stick? Sorry, I am not into XBOX/PS3 FPS games, and I am not gonna play modern FPS on this device (I would prefer mouse and keyboard controls on a PC); I mostly liked the ports of PC FPS games on such devices as a technical achievement rather than something I would play. And to tell you the truth, old FPS like Doom or Quake especially played well enough even on my GP32 with only two fire buttons on the right! Or so I think..

I guess the GCW0 is for a specific kind of people, who already understand what the machine is meant to be, who know the history of the open handheld communities from the gamepark/dingoo times, and who want something new with a strong community, for emu/homebrew gaming and retro game programming too. At first I was afraid this kickstarter would fail because there are few homebrew fans out there who would distinguish this from your average android handheld (or from the lame Neogeo-X, which just has publicity and, I learn, uses the same chipset as the GCW0, but locked to play only 20 Neo Geo games, and overpriced too), but I see the reality is different, the support is great, and hopefully this will live for long.

Maybe if this is successful enough and the creators want to go for a GCW1, then we might see a higher-res and touch screen (which is the only thing that would be a big upgrade for me right now), but let's first see this one develop a great community. It might make me want to go back to coding something new for this device instead of just ports of my old demos. I'd like to make a game..

Thursday, 10 January 2013

GCW Zero

GCW-Zero: Open Source Gaming Handheld (Kickstarter campaign)

Finally! I really hope this project goes well. I backed it with $160 (+ $20 postage) because I really want this one, and that's a little above the $135 minimum you can give to also preorder the handheld. Too much, you will say, I know. People tell me, why this when you can get a cheap phone? Because I hate touch. I can't play emulators with touch controls. Oh, but you can add that hardware joypad that attaches to phones. I am not going to carry another controller along with my phone!

I am talking about the niche category: a homebrew handheld from the community, not a commercial one that merely happens to be able to run homebrew. (They tell me the PSP is enough for emulation, so why do you need a Caanoo? I haven't unlocked my PSP yet because it's too much work; my firmware was updated and I'd have to find another broken PSP or something, fuck that shit!)

And I am talking about a homebrew handheld made by the community and supported by the community. I know there were some cheaper Chinese android handhelds that even played N64 well, but these were not supported, nobody knows them, and the Chinese companies make many series of them instead of concentrating on a single one.

We needed something like a successor to Dingoo/Gamepark. Something for the homebrew console community (not touch, not android, just classic gamepark style!). Pandora failed to deliver and was too expensive. But now we have GCW Zero!

Some info
Forums
Kickstarter link

p.s. A while ago I wondered about the community and whether it would be nice to have an appstore. Well, the GameGadget (regardless of whether it was even as powerful as an old dingoo) didn't go well (and I've heard the company didn't behave well), the nD is a joke (where is it?), so forget all that stuff and take the GCW Zero, from several members of the community (see the kickstarter for a list). This is a more serious attempt. Scene gonna move!
p.p.s. Other good news. My old classic GP32 had died in 2008. I just decided to buy a used one from eBay. I'm not telling you how much I paid. But I really miss it, the nostalgia, and there isn't even a proper emulator. I am just waiting for it!

Thursday, 15 November 2012

Old computers are not as much junk as you might think

It happens that I have a quite different perspective on old computers. I like to stretch my feeling of what the true capabilities of a very old computer are. It's not nostalgia, it's not being a fanatic, it's just a feeling that I love: that of seeing an old computer, one people consider extremely slow, do really cool stuff on the screen.

And so, yet again, I have a much better appreciation of what old hardware can do than most people. Not just as a programmer but as a user of retro PCs too. There is that bias where people observe what their current PCs can do and how demanding current software already is, and extrapolate that hardware from even 3-5 years ago must be extremely boring and that you wouldn't be able to do much with it.

Well, the problem is that most software doesn't exactly represent the true power of older machines well. If you try to run the latest software on PCs from 5 years ago then yes, everything will grind to a halt. And maybe the web is becoming so demanding, with its heavily loaded websites full of JavaScript and ads, that it could really be a drag to try surfing the net with a Pentium 3 or 4. I don't know, I haven't tried that one. But the fact is that you can always find software written back when the Pentium 3 was common and discover that you really could do lots of cool stuff at that time.

Here's an example of how extrapolating would make you think a 386 is a fucking calculator. So, I have my ultra new dual core PC and I bought a new graphics card. I install it and then there is the moment where I want to uninstall my old drivers and install the latest ones. During that limbo time when no drivers are installed, maximizing or scrolling a single window, you see the very slow redraw of the window in real time. Why, I'd ask? Many people would say it's the 2d graphics acceleration, which isn't there without drivers, that normally makes it fast. Could be. But this slow? I remember that back with my 486 and a crappy S3 Virge, redrawing was fast enough. One would say I was at 800*600 with 16bit color then while now I'm at 1920*1080 with 32bit color, and he wouldn't be entirely wrong. But the message is this: a dual core Athlon without hardware acceleration draws windows slowly, so what would a 486 with an S3 Virge do if we extrapolated? Maybe people were still living in DOS? A GUI on a 486? No fucking way!!!

Or what would they think about the idea of a windows-like OS running on an Amstrad CPC, an 8bit computer? "Not even with bullets", as we say in Greece :). Like maybe waiting 10 minutes for a window to redraw? Yet those are the true, stretched capabilities of what a CPC can do: SymbOS video (I know it's running on the CPC T-Rex with a Z80 at 24MHz, but I couldn't easily find a good plain CPC SymbOS video showing enough features, though I have witnessed it myself and it's not much slower). Now extrapolate in the opposite direction. If a CPC can do that, imagine what a 286 could do!

I know, I know, a CPC has 16KB of video RAM while a 286 has 64KB in plain VGA mode 13h, and as systems improve they add better resolutions and color depths which need faster graphics cards and CPUs, etc, etc. Maybe that's a good argument for the negative extrapolation: yes, the CPC had only a Z80 at 4MHz, but it didn't need to push 8MB of video memory every frame the way my 1920*1080*32bpp screen does. Nor did it have accelerated functions on the graphics card.
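
To put some rough numbers on that argument, here is a quick back-of-the-envelope sketch in C, using the screen modes mentioned above; the 50fps figure is just an assumption for the comparison, not benchmark data.

    /* Bytes per frame for the screens discussed above, and the raw
       bandwidth needed at an assumed 50fps. Just arithmetic to show
       how the workload scales with resolution and color depth. */
    #include <stdio.h>

    static void show(const char *name, long w, long h, long bpp)
    {
        long frame = w * h * bpp / 8;   /* bytes per frame */
        printf("%-26s %8ld bytes/frame, %7.1f MB/s at 50fps\n",
               name, frame, frame * 50 / (1024.0 * 1024.0));
    }

    int main(void)
    {
        show("CPC Mode 1 (320x200x2bpp)",    320,  200,  2);   /* ~16KB */
        show("VGA 13h (320x200x8bpp)",       320,  200,  8);   /* 64000 bytes */
        show("Modern (1920x1080x32bpp)",    1920, 1080, 32);   /* ~8MB */
        return 0;
    }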

Sometimes I like to extrapolate in the positive direction with my own projects. I have coded enough demos on Gamepark devices with an ARM processor at 100 or 200MHz, or even the latest Caanoo at 533MHz (my unit can overclock to 800MHz). Of course I know that MHz is not everything, and memory bandwidth or a small cache can bring those things down a lot, but the point is that I can't help but admire those Acorn Archimedes demos I have watched again and again in an emulator (especially those from the group Xperience), where there is either an ARM2 at 8-12MHz (old models) or a later ARM3 at 33-40MHz, and yet you can see good 3d or 2d effects at resolutions similar to the Gamepark handhelds (ok, their color depth is 8bit and not the 16bit I used). I know how hard it is because I tried to write some pixel-per-pixel effects on the GBA with its ARM at 16.8MHz, where the resolution is even lower (240*160), yet it took effort to run even a simple 2d effect at something like 25fps. I believe many GBA coders optimize their routines with ARM assembly, and I know that the Xperience group did that too. Now, my gamepark demos were pure C and I never felt the need to optimize with assembly at the time I was writing them, because the frame rate was already high enough for most of my effects, but I would love to try some ARM assembly one day (which I've heard is quite fun) and see what those beasts could do if their powers were harnessed!
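
For anyone wondering what a "pixel per pixel effect" on the GBA even looks like, here is a minimal sketch of the kind of loop I mean, assuming mode 3 (the 240*160, 15bpp framebuffer at 0x06000000). The plasma-style formula and the wave table are made up for illustration; it's the per-pixel write over the whole screen that eats the 16.8MHz, not this particular math.

    /* Rough sketch of a per-pixel effect on the GBA, mode 3:
       240x160 pixels, 15bpp framebuffer at 0x06000000. Touching
       every pixel every frame is exactly the kind of work that
       drags a 16.8MHz ARM down to ~25fps without optimization. */
    #include <stdint.h>

    #define REG_DISPCNT  (*(volatile uint16_t *)0x04000000)
    #define MODE3        0x0003
    #define BG2_ENABLE   0x0400
    #define VRAM         ((volatile uint16_t *)0x06000000)
    #define RGB15(r,g,b) ((r) | ((g) << 5) | ((b) << 10))

    static uint8_t tab[256];   /* crude triangle-wave "sine" table */

    int main(void)
    {
        for (int i = 0; i < 256; i++)
            tab[i] = (i < 128) ? i * 2 : (255 - i) * 2;

        REG_DISPCNT = MODE3 | BG2_ENABLE;

        for (uint32_t t = 0;; t++)                      /* one pass per frame */
            for (uint32_t y = 0; y < 160; y++)
                for (uint32_t x = 0; x < 240; x++) {
                    /* arbitrary "plasma": sum of two table lookups */
                    uint8_t v = (uint8_t)(tab[(x + t) & 255] + tab[(y * 2 + t) & 255]);
                    VRAM[y * 240 + x] = RGB15(v >> 3, 0, (255 - v) >> 3);
                }
    }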

Yes, I think of these devices as beasts. I think that even the old GP32 with an ARM at 100MHz (which I overclocked to 156MHz at most) could easily be like a very fast 486 or early Pentium, excluding the FPU of course. Those monsters spend fewer cycles per opcode than older x86 revisions or the Z80, for example (they say the ARM2 at 8MHz has 4 times the MIPS of a 68000 at the same MHz, and I have witnessed it in some videos). Yet check the other fact: my smartphone is a slow one. Yet it has an ARM at 600MHz, similar to my Caanoo. But Android is a crawl here (because it's based on Java?) and unresponsive. I have seen faster phones that can handle it, some with dual core processors. But it's a beast; I know that an ARM at 600MHz is a beast, because I consider even the GP32 with the 100MHz ARM a beast, and I've watched plenty of Archimedes demos and tried RISC OS on a 33MHz (emulated) machine. If I extrapolated backwards from my latest smartphone running Android, I would feel like my current phone is a big piece of junk (which I do feel, judging by the unresponsiveness), but I know it isn't. I can appreciate what is in there and what would really be possible.

And then I'd like to add to this the counter-effect of false retro memories. Sometimes we remember our old computers as faster than they really were. Maybe because we were used to those speeds then, and coming back from the future it feels different. I was transferring some ZIP and ARJ archives of old games to my 386 (disk by disk, ugh) and then I decided to decrunch some of them. My initial reaction was: "What? I never remember decrunching taking so long on my old PC! Was it really that slow?". Either I had false memories, or being used today to instant decrunching times (even with larger archives) it just felt odd.

But it doesn't change how I feel about old computers. I don't look at them as pieces of junk that are as fast as pocket calculators. My friends look at me coding for old computers and wonder "A 386? What can you do with it? If I only had this I would be bored to death.". Hell, they even feel like this about a Pentium 4. But a few years ago we used to work on these machines, and we thought a Pentium 4 was paradise: you could run the latest games, surf the web, watch movies, etc. The other thing here is that I don't see computers as media devices. I see them as things to experiment with, to program, to make them do whatever I like. I feel the essence of code and creativity. Slow processors and primitive graphics don't bother me. I am happy even with the oldest thing I can program.

Some people might see random ugly pixels on a CPC and miss their HD media players and modern 3D, but I get enjoyment from deciding how to make those pixels blink on the little CPC screen. And I discover that with some clever optimization those little 4MHz can do a lot of cool stuff! And if I am already so pleased with the performance and creative possibilities I can harness from the CPC, imagine how much more that extrapolates for my 386 or the beasts we carry with us today. It feels great, like I would never get bored of computers in the way I see them, even if the production of better computers came to a halt. There is still a lot to explore, even in computer communities where you'd think they have done everything and there is nothing more to see (just check the C64 scene; every year you see something that makes you think this is the end, they have exploited the machine to its full potential, and yet you see new, more impossible things a few months later).

Creative use of computers makes this feeling possible. I could never enjoy a computer so much just from the mindset of a consumer.

p.s. I got into writing this post after being inspired by a twitter message about how angry the author was with people underutilising the CPU and then saying "let's do it on the GPU, the CPU is crap". Come on people, CPUs are huge beasts already! (Well, I know GPUs are much greater beasts, though not as multi-purpose.)

Tuesday, 31 July 2012

Bitness of a CPU


Yet again, I've started asking some local geeks what defines the number of bits of a CPU. I was puzzled once more because I have read different answers about the definition and unexpected values for specific CPUs. So I decided to investigate this matter more, aka doing my homework (I won't sleep tonight :P).


The two most common opposing metrics are:

  • The size of the registers defines the bits of a CPU
  • No, not only the size, but also the data bus width

Some considerations:

  • Is a Z80 a 16bit processor then? It has 16bit registers! No, those are "fake" regs created from the 8bit pairs. Ok, so it's the size of the "true" registers. Question: how do we tell that they are not real regs? More cycles needed? Very few arithmetic operations on the pseudo 16bit regs? An 8bit internal bus? (See the little C sketch after this list.)
  • When we say data bus, what do we mean? In the second metric above, most people mean the external data bus, i.e. the communication between the CPU and memory. When that argument comes up, that's the one they mean.
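
As an aside, here is a toy illustration in C (not emulator code, and not a claim about the exact internal micro-operations) of what "fake" 16bit registers means to me: the pair is just two 8bit registers glued together, and a 16bit add is effectively two 8bit adds with a carry between them.

    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint8_t h, l;              /* two real 8bit registers */
    } reg_pair;

    /* roughly what "ADD HL,DE" boils down to: two 8bit adds with carry */
    static void add16(reg_pair *hl, reg_pair de)
    {
        uint16_t lo = (uint16_t)hl->l + de.l;
        hl->l = (uint8_t)lo;
        hl->h = (uint8_t)(hl->h + de.h + (lo >> 8));   /* propagate the carry */
    }

    int main(void)
    {
        reg_pair hl = { 0x12, 0x34 }, de = { 0x00, 0xFF };
        add16(&hl, de);
        printf("HL = %02X%02X\n", hl.h, hl.l);   /* prints 1333 */
        return 0;
    }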

The chaos of terminology:

  • I tried to research this by reading the definitions of terms like data bus, internal/external data bus, etc, in wikipedia. I need to know the terminology so that I can understand the different arguments better.
  • In the wikipedia article about the data bus, in the definition of the internal and external bus, I read this:  "Internal bus, also known as Internal data bus, memory bus or system bus or front-Side-Bus, connects all the internal components of a computer, such as CPU and memory, to the motherboard"  and  "The external bus, also known as expansion bus, is made up of the electronic pathways that connect the different external devices, such as monitor, printer etc, to the computer.".  This might be correct (what do you say?) but it's totally inconsistent with the way this terminology is used in the bitness arguments.
  • In the usual CPU arguments about bitness, by external data bus people always mean the CPU<=>memory communication, and by internal data bus probably the communication inside the CPU (between regs and ALU and who knows what other stuff). The same meaning is used in the wikipedia articles describing each CPU. But in the data bus article, the internal bus is all that stuff together and the external bus is something probably irrelevant to the bitness of the CPU.
  • To make things more confusing, sometimes when arguments arise they don't even specify whether they mean internal or external, for example "You are not right, this CPU is not a 32bit but a 16bit because the data bus is 16bit only". Which one? (I can only deduce that most of the time they mean the CPU<=>memory one, because for the vast majority of CPUs it happens that regs size == internal data bus.)

Other less common metrics I have heard:

  • ALU bits. It can mean many things (I haven't entirely understood it yet)
    • The internal communication width between the registers and the ALU. In a few words, our classic internal data bus definition (as used by people in the argument, not the wikipedia article I linked). It usually happens to be the same number of bits as the CPU regs.
    • Do most standard arithmetic operations exist for those bits too? Example: the Z80 with its pseudo 16bit regs can just do ADD HL,DE/BC but not most of the other logical or arithmetic operations, so it's not 16bit.
    • Calculation bits per cycle! Does the Z80 spend one cycle in the ALU to do an 8bit arithmetic calculation? I just found from two different random sources (just forum discussions, nothing more official yet :P) that the ALU of the Z80 worked at 4 bits, which means two passes are spent for a single 8bit add. Someone joked that if we took this metric into account the Z80 would be a 4bit processor, and that made me wonder WTF?
  • Address bus width. Of course this is absurd and nobody brings it into the argument. Yes, the Z80 and 6502 have 16bit addresses because they need to address 65536 bytes of memory, but are they 16bit? Of course not! The classic PC segment/offset style was 20bit; does that make the 8086 a 20bit processor? Nope! (There's a tiny segment/offset sketch after this list.) Though the classic wikipedia articles on bits, for example the 32 bit article, say this:  "Also, 32-bit CPU and ALU architectures are those that are based on registers, address buses, or data buses of that size."  Of course you could say that it says they could be based on those, not that they should be, though it could be misleading for someone. Anyway, I am just mentioning this term too in order to exclude it from the bitness argument.
  • Another confusing thing is the bitness of a machine. Of course this doesn't make much sense today, and it was sometimes a marketing gimmick. For example the Atari Jaguar. They said it was 64bit and yet I know it had a 68000 as the main CPU. Though I recently read on the Jaguar wiki page that alongside the 68000 it also houses several programmable graphics chips with a 64bit architecture (it doesn't define which CPUs), and the 68000 is just the chip that manages all of them. Even the external data bus between those chips is 64bit, I read now. A pretty strange architecture, I'd say. But anyway, the bitness of a machine is a different and more subjective matter, and not our current subject, which is the bitness of a CPU.
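
Just to make concrete why the address bus shouldn't define bitness, here is a quick sketch of how real-mode x86 builds its 20bit physical addresses from purely 16bit quantities (the example values are mine, picked for illustration).

    #include <stdint.h>
    #include <stdio.h>

    /* 8086 real mode: physical address = segment*16 + offset,
       so two 16bit values produce a 20bit address. */
    static uint32_t phys_addr(uint16_t segment, uint16_t offset)
    {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void)
    {
        printf("A000:0000 -> %05X\n", (unsigned)phys_addr(0xA000, 0x0000));  /* A0000 */
        printf("1234:0010 -> %05X\n", (unsigned)phys_addr(0x1234, 0x0010));  /* 12350 */
        return 0;
    }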


So suddenly I read strange things about CPUs:

  • The 68000. I always thought of it as a 16bit CPU (maybe because people associated the Amiga or the Mega Drive with 16bit, you know, the next generation/marketing confusion (counter-argument: but why not market them as 32bit and be even more impressive then?)), but some people refer to it as 32bit. I read that it certainly has 32bit registers and possibly(?) a 32bit ALU, but the external data bus (CPU<=>memory) is surely 16bit, and in one wikipedia source I've read that even its internal data bus was 16bit. Most sources I've found describe it as a 16/32bit processor because of that (the 68000 wiki article cites this to the Motorola MC68000 Family Programmer's Reference Manual).
  • In contrast, the 386SX (the crippled-down version of the 386DX, which was 32bit in that aspect too) is fully 32bit internally but has a 16bit external data bus, yet I haven't heard anyone consider it a 16bit CPU. In that sense we have another metric which says: bitness of a CPU = regs size & internal data bus (and the external data bus doesn't count).
  • The same happens with the 8088, which is a crippled-down version of the 8086. Everything is 16bit except, again, the external data bus, which is 8bit. In most places I have heard it called a 16bit CPU, except for a few where I see the 8/16bit distinction.

And to finish my little big post, I'd like to say that I just wanted to put things in perspective, to collect all these conflicting thoughts together in one post for a better overall view, and maybe also read what other people have to say (if anyone ever reads this and decides to comment anyway). I am not a maniac about the number of bits, I don't want to defend my favorite CPUs or anything; it's only that sometimes I get obsessive about discovering whether I can extract a few certain facts from tons of conflicting opinions on the internet. If you read a single forum where they discuss bitness, you are never sure about the right definition, but if you search more sources and more discussion forums you start building a more well-rounded view, which might still be conflicting, but at least you can collect these things together and try to make sense of them. I am just obsessing about making sense.

Anyway, even the CPU metric is somewhat subjective, and as someone said, who cares about bits? Just program the damn thing! The Z80 with its fake 16bit regs is still an 8bit CPU (I never doubted it), yet you can do miracles with those additional opcodes (8.8 fixed point additions with a lot less effort and fewer cycles than on the 6502), so it's still an advantage. And maybe instead of labeling a CPU with a single number of bits, describing that it has 32bit regs, 32bit ALU, 32bit internal bus and 16bit external bus is a better way to let people decide for themselves. Or the more complete 16/32bit labeling would suffice. Or say it in words: this CPU is 32bit internally but 16bit externally. Makes more sense.
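
In that spirit, here is a small sketch of what the "label" could look like if we stop forcing a single number: a record per CPU. The figures are my reading of the sources discussed above (the Z80 4bit ALU and the 68000 values in particular are disputed), so take them as illustration, not gospel.

    #include <stdio.h>

    typedef struct {
        const char *name;
        int reg_bits;
        int alu_bits;
        int internal_bus_bits;
        int external_bus_bits;
    } cpu_bitness;

    static const cpu_bitness cpus[] = {
        { "Z80",    8,  4 /* disputed */,       8,  8 },
        { "8088",  16, 16,                     16,  8 },
        { "68000", 32, 16 /* per one source */, 16, 16 },
        { "386SX", 32, 32,                     32, 16 },
        { "386DX", 32, 32,                     32, 32 },
    };

    int main(void)
    {
        for (unsigned i = 0; i < sizeof cpus / sizeof cpus[0]; i++)
            printf("%-6s regs:%2d alu:%2d internal:%2d external:%2d\n",
                   cpus[i].name, cpus[i].reg_bits, cpus[i].alu_bits,
                   cpus[i].internal_bus_bits, cpus[i].external_bus_bits);
        return 0;
    }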

Wednesday, 4 April 2012

Random Game Thought Metrics

Just some funny/strange/interesting thoughts. I was walking back home from the city center, all alone at night, thinking about random things. We'd had a Wii gaming session at a friend's home earlier, so I was thinking about games among other things. I remembered the petermolydeux twitter, which at first I wasn't sure was the real Peter Molyneux or a parody, with its funny/weird/extreme game ideas, and I decided to have a laugh by describing games that actually exist in the same style. Imagine a mock-up of the mock-up, for example, stating things like these:


- Imagine a game where there is a princess in a castle and two Italian plumbers trying to save her by entering green tubes, collecting coins and eating mushrooms. At the end, they discover to their great dismay that the princess is in another castle.


- Imagine a game where you are a yellow ball with a mouth and eyes, eating pills and being chased by ghosts. If you eat the big pill then you can chase the ghosts yourself and kill them.


- A game where some blocks are falling and you are controlling them. The purpose of the game is to fill complete lines with blocks. If you fill four lines at once you gain the most score. At the end you might see a finale with a rocket flying into space.


- You control two baby dinosaurs that materialize bubbles from their mouths. These bubbles can capture enemies, and you must then pop the bubbles to convert the enemies into fruits and other bonuses that you eat. If you take too long, a skeleton bubble appears and chases you to death.


Hehe, you get the concept? It's just a bit of funny mockery, taking existing well-known games and converting them into petermolydeux style. Someone should make this twitter :)

But another thing I have discovered is that this is a good metric for understanding how original an existing game concept is. Just take your favorite games and describe them in a few words, petermolydeux style! Do they sound too common or too crazy then?

More examples, of games that may have been original for their time but whose character/world/gameplay concept sounds too common:

- A game where you are a hero with a sword, you are trying to save a princess, and when you hit enemies or bushes, diamonds appear with which you can buy a bow, bombs, a shovel, etc. You have to progress through different dungeons and fight final bosses till the end.


- A game where your girlfriend is abducted by a street gang and you have to beat the crap out of all the bad guys in the city and finally get her back. On your way you can find weapons in the streets like a baseball bat, a knife, dynamite, etc. Sometimes you can also pick up metal barrels or huge rocks and throw them at enemies.


The first example is Zelda of course. Doesn't it sound very typical? A classic medieval hero story. Everybody wants to save the princess or the kingdom or both. There are swords and dragons and beasts and evil mages. You get money as you loot your defeated enemies. You can buy weapons. Was it a bad game? Of course not. It defined its own style of gameplay, copied by several others. Maybe the concept wasn't crazy, but the gameplay had its own style (something like a mix of action with adventure and maybe a few rpg elements (some people do not consider it an rpg)).

The second example could be any beat 'em up. I had Double Dragon in mind. When the first beat 'em up of this style came out (was Double Dragon the first? I don't know; it's just the first I had seen) there was nothing like it before, even though the concept is common (something everyone might have easily thought of).

So it doesn't mean that if some idea sounds too common in molydeux style then the game isn't worth it. It's only a metric for realizing how extreme or common the basic concept of an existing game sounds. How many other games do you remember where the main protagonists are baby dinosaurs? Is there any other game at all where you capture enemies in bubbles and pop them to turn them into fruit? Bubble Bobble seems to be a game whose designers were taking LSD =)

Snow Bros: a game which copied Bubble Bobble but is still original in its own form. You capture enemies in snowballs, and now you don't just break the snowballs, you hurl them into other enemies, and the snowball may grow bigger until it hits a wall and kills the enemies. Even if it seems a copy, it keeps some originality because the concept sounds quite different, so different that I might not have realized it's just a copy of Bubble Bobble if I'd only read a short description. Now I could compare the short description of Bubble Bobble with the one of Snow Bros and see it from a different view than just a lame copy (it was actually very good and successful even if it wasn't the first).

And now for something completely different (to close this blog post). I continued playing this description game with one of my favorites on the CPC: Fruity Frank. And then I realized something else I hadn't thought about before.

- A game where you are a little man digging into some pudding and eating the fruits. Only the apples are heavy like rocks and can crush you, so don't eat those. You also throw something like a little ball (actually two pixels :) that bounces off the walls like a ping-pong ball and can hit the enemies. The funny thing is that the enemies are also fruits... wait!


..wait. Not fruits. I just realized, the enemies must be.... vegetables?

I am not sure about the little guys with the big noses (we thought those noses were bananas, which we know as fruits, but I will come back to this), but the purple guy somehow reminded me of an eggplant. The green guy could be another vegetable. And what about the strawberry enemy? We all consider it a fruit, but then I quote Sheldon Cooper: Not technically a fruit, but okay. :)


So, I realized that the concept is Fruits vs Vegetables. Even in the case of the strawberry, which is secretly not a fruit (and is the hardest and rarest enemy of the game). Wow! Did they really think of this concept or was it random? We can't know. But Fruity Frank definitely has its own unique identity, because its character/world concept is unique enough, something that a petermolydeux description would show.

Oh yes! Fruity Frank could be thought of as a war between Fruits and Vegetables. And the strawberry is technically not a fruit (a banana might not be a fruit either, or is it?). Hidden in the concept too. Wow! MIND BLOWN.

And other funny/weird/stupid thoughts on games..