
Computers are the new magic


Christopher


This is how I feel our society is developing.

I know computers are not magic. I understand them too well to ever have a magical view of them. But since I am active on a programmer forum I have a lot of contact with new programmers. And it appears certain fundamental knowledge is starting to disappear.

 

Computers and CS have the following things in common with magic:

1. They are incomprehensible to those outside the knowing circle

How the programmer sees it:

[attached image]

 

How the user sees it:

[attached image]

 

And the difference between easy and virtually impossible:

xkcd.com/1425/

 

2. They both have tradeoffs

PS238 brought me to this. Within a week of having read "magic has tradeoffs like that" I used it on a programming forum: "Programming has tradeoffs like that."

There are several libraries and techniques that, when used, literally multiply productivity. You get so much more done in the same time if you use them than if you did not. They all, however, come at the cost of runtime performance. All of them have the tradeoff of being less efficient.
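A tiny illustration of that tradeoff in Python (the library and values here are just an example): the standard-library JSON parser does in one line what a hand-rolled parser would need pages of code for, at some runtime cost.

```python
import json

# High-level route: the standard-library parser, one line of code.
data = json.loads('{"user": "alice", "visits": 42}')

# The low-level route would be a hand-written character-by-character
# parser: far more code, faster only if heavily tuned, and a fresh
# source of bugs. The one-liner trades some runtime cost (generic
# code paths, temporary objects) for a huge productivity win.
print(data["visits"])  # 42
```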

Any assembler language could be used to write an OS and the applications running on that OS. But using anything closer to the machine than native C or C++ for the drivers, .NET or Java for the applications, and DirectX for graphics work would be insane. You would need a decade for the simplest stuff.

 

Modern programming is done within some framework - .NET, the Java runtime. Their tradeoff for OS independence and the hiding of pointers (see below) is that they are unable to do some things at all (too close to the hardware). And they do not work in real-time scenarios.

 

3. Raw operation is rarely done

Not too long ago every Windows application was written in native C++. Native C++ uses pointers. Those are perhaps the single most useful and also the single most dangerous tool of a programmer.

95% of all Windows security patches were because someone *bleep*ed up handling pointers directly (which is actually quite easy). All modern runtimes are developed with the goal of "don't let the programmer handle pointers, for Codd's sake!". Or "don't let him input faulty indexes without an alarm."
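A sketch of what "faulty indexes with an alarm" looks like, using Python as a stand-in for any managed runtime:

```python
# A managed runtime turns a wild memory access into a loud, safe error.
values = [10, 20, 30]

try:
    x = values[7]          # out of bounds
except IndexError as e:
    caught = str(e)        # "list index out of range"

# On a raw array in native C++, values[7] would silently read (or
# write) whatever memory happens to sit there -- the classic source
# of the crashes and security holes described above.
```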

 

People today program in frameworks that were designed with those ideas. A lot of people program in C++ for .NET rather than in native C++.

As a result, knowing that pointers exist or how they work is dying knowledge. They used to be within the first 10 chapters of every programming book. Now they might not be in them at all, because we have suitable replacements. Not faster or better, but safer and acceptably fast in 99% of all cases.

The last bastions of programming that close to the hardware, with that much focus on optimisation, are operating system kernels and drivers. And the guys who make those frameworks. Of course the pointers are still around (we can no more program without using them than humans can live without oxygen), but they are managed by an infallible computer rather than a human.

 

Even optimisation is left more to automation. "Which is faster?" is actually not a question a programmer should even ask anymore.

The compiler does a lot of optimisation work for the programmer. And the rest is probably a matter of design rather than implementation.

Those runtimes also often have something called a "just-in-time" compiler, which is able to edit out "dead code" completely just before the code actually runs. So what you program and what actually gets executed might be two very different things...

The JIT has caused its fair share of issues and bugs, but it is too practical to live without.

 

4. It is hard to know everything (or at least enough) about computers

I have pretty wide and acceptably deep knowledge. But a lot of the time people are strictly specialised. A programmer might literally have no idea how to set up a server in a network. Or how to set up a network beyond "plug into router -> magic".

I used to joke that "a person who knows everything about computers would be insane", simply because it is too much knowledge and too much to learn. And also because you can't have any life besides learning that stuff, which is quite insane too.

 

More and more we have issues with people coming to programming forums because their network code runs into a networking issue.

Getting them to accept that it is a networking issue can be hard. They are so accustomed to seeing everything as a programming issue that accepting it is a totally different field of CS can be difficult.

 

5. Knowledge of how stuff works is overrated anyway

We all know how to use a telephone or smartphone to dial a number.

But how does it actually work? How is the speech (an analog signal) transformed into a digital signal? How does the phone choose which tower to use? How does the handover between towers in the middle of a call work? What does the network protocol between the phone and the tower look like? How are the network between towers and the databases of the provider organised?

I know enough to know which questions to ask, but I never bothered actually reading up on them.

 

The term "cloud" was originally invented to mean "stuff where I roughly know what it does, but I do not need to know the details of how it works".

Most networking diagrams have one cloud shape in them - the internet. We know it is a network. We know it connects devices. But beyond that it is a minor detail. We could not sensibly draw which routers and switches the signal goes over. And half the time our data would not even take the same route twice.

 

6. One word can make the difference between life and death, boon and bane, win and lose

"Everything Wrong With..." is a YouTube video series of movie criticism. It includes Elysium:

 

 

Sin 106 was that "a single line of code could do that".

As a programmer I can tell you: that is the most realistic part of the whole movie. This has been reality for decades. Those single lines of doom are DB admins' and DB programmers' bread and butter. And that system is just a really low-latency database.

We wish films about hacking or computers in general would be that realistic.

 

Few things in CS are quite as dangerous as database operations. The difference between "increment that specific value there by one", "tell me which value is there", and "delete the whole database" is just what you write. All three go into the same slot and have the same level of security checks.

You can fry your whole database by forgetting to say "but only entry number X" in a DELETE or UPDATE order. UPDATE is actually a bit more dangerous: most people are aware that a DELETE needs a clear statement of what to delete, and a dropped DB will be caught and fixed quicker than garbled data.
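A sketch of that one-missing-clause disaster, using Python's built-in sqlite3 as a stand-in database (table and values are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
con.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, 100), (2, 250), (3, 40)])

# Intended: increment one specific account.
con.execute("UPDATE accounts SET balance = balance + 1 WHERE id = 2")

# The missing-words disaster: drop the WHERE clause and the very
# same statement silently hits every row in the table.
con.execute("UPDATE accounts SET balance = 0")

balances = [row[0] for row in con.execute("SELECT balance FROM accounts")]
# balances is now [0, 0, 0] -- every account wiped, no warning given.
```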

 

As a result, only very few people are allowed to make changes to the DB. Fewer than are allowed to spend money or make day-to-day decisions like "should we shoot those down". Some say the admin is the most powerful person in a company.

99.99% of that program was just to get past every form of security that was possibly in the way (even that deep in the secure control room).

That one line was the entire payload, and changing it from "name X president" to "make all people on Earth citizens" would actually be that easy. Which is why getting into that room and getting that program was so hard in the first place.

 

7. Superstition is making a resurgence - even among the specialists

The combination of 1, 4, 5 and 6 means that specialists are uniquely vulnerable to falling into superstition. Especially outside of their area of expertise.

For an inexperienced programmer, a lot of stuff I can easily identify and explain must seem mystical. "I don't know what is going on" is perhaps the most common sentence in opening posts on every programming forum.

 

The Tech-Priests of Warhammer 40K have become a lot less ludicrous to me in recent years.


I recommend the Wizardry series by Rick Cook. In it, programmer Walter Irving Zumwalt (Wiz to his friends) is abducted to a land where magic works. He has only the most rudimentary magical ability. Magicians there write very complex, elaborate, specialized spells that hinge on numerous variables. The spell changes based on the phase of the moon or how far from the equator the spellcaster stands. Wiz decides the only way for him to live in this world is to become a magician. He approaches the problem like a programmer. What is a spell? A spell is a series of magical instructions that produce a desired outcome. Wait. That sounds like a computer program: a series of digital instructions that produce a desired outcome. He sees that instead of writing one complex spell, he can write a series of small spells that when chained together would produce the desired outcome.

 

He then begins to write a spell compiler and names his scribing demon EMAC.



The thing is that this world once started magic the very same way. It's just that after several iterations/generations that basic knowledge got lost.

It became easier to use and learn the "precompiled" spells than to learn how to program in magic-assembler (massembler? ass-magic?).

 

The bad thing is that nobody would help him debug those spells. So if his incantation of "clean room" has a bug that accidentally summons a horde of fire demons, he would have to look long and hard to find it. Except this time around bugs can eat him and devour his soul.

 

Magic is a compiler with no documentation. One that might also behave differently based on unknown factors (like the phases of the moon). The first step in debugging would be to get some form of human-readable output (with meaningful measurements). I somehow doubt magic has a decent ToString() function built into objects...
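For comparison, the ToString()-style hook is __repr__ in Python terms; a sketch with a made-up Potion class:

```python
# A ToString()-style hook is what makes state visible while debugging.
class Potion:
    def __init__(self, name, volume_ml):
        self.name = name
        self.volume_ml = volume_ml

    def __repr__(self):
        # Human-readable output with meaningful measurements.
        return f"Potion(name={self.name!r}, volume_ml={self.volume_ml})"

p = Potion("clean room", 250)
print(p)   # Potion(name='clean room', volume_ml=250)
```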



 

And you've just described the plot of book 2 in the series. People are trying to modify Wiz's new spells to make them deadlier (a spell Wiz wrote to make magical creatures uncomfortable within a given area is modified to kill all magical creatures in a given area). Wiz's girlfriend calls in a team from Wiz's world (she finds them at a Ren Faire) to rework the compiler with better debugging, automated error checking that rejects spells with malicious behavior*, and proper documentation. Since every new spell has to go through the compiler, this will work going forward. To make people stop using the old spells, they introduce "customer service" to the pre-established communications channel that all wizards and witches use to communicate with the council when spells go wrong. Then they send out virus demons to ensure that modified spells misbehave. The casters then call their old help line.

 

"Council of the North. Thank you for holding. What seems to be the problem today?"

 

"Besides that terrible music I had to listen to while waiting an hour? I've always talked to a wizard immediately!"

 

"I'm sorry. We have been experiencing a higher than normal call volume. Why did you call?"

 

"My spell misfired and summoned singing gnomes. Oh, that one's mooning me!"

 

"Which spell were you using?" (chewing sound as helpdesk operator snacks)

 

"DDT."

 

"That is not a known behavior of DDT. In fact, nothing in the code would lead to that. Are you sure it was DDT?"

 

"Well, I'm really using the variant, demon debug begone."

 

"Oh, yes, we are very familiar with that spell. It seems it was hacked together and there is a 10% chance that instead of killing it will summon creatures. You're lucky. You've only managed to summon an annoyance. [whispers] We had one village summon a fire demon, but don't mention that to anyone. We will teleport you a scroll of DDT 2.0. It will banish the singing gnomes. And in the future, only use spells featuring the 'Approved by the Council of the North' seal."

 

*One step of which is three demons named hear-no, see-no, and speak-no. If any of them see a problem in the spell they're scanning, they break into a Stooge Fu fight. If the spell has no problems, they sing as a trio.


I once read about a few "laws of magic" that seem to be uniformly true. And at least to some degree they apply to computers.

I am using this comic as reference:

http://www.theduckwebcomics.com/Mindmistress_at_Drunk_Duck/5484674/

 

The biggest issue is probably the difficulty of accessing that information or affecting those systems. Unlike magic, computer networks are not "all-encompassing". Yet.

 

"The law of Similarity - something resembling another thing affects that thing":

There is a lot of need for similarity in computers. In order to "talk", computers need protocols: conventions on what can be transmitted and how it must be formed.

The web browser that is modelled after the HTTP protocol affects the web server modelled after the HTTP protocol.

Without those, all the data would just be ignored, like (literally) background noise.

Perhaps the simplest case is actually time. 99% of all security systems will default to negative if the two computers cannot agree on what time it is. Set your computer clock one year into the future and no certificate will be valid, no cookie will stay around.

No time server will even answer if the difference is more than 16 hours or so.

TCP hole punching is based on imitating similarity to another computer (with the help of said other computer).
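The time-agreement idea above can be sketched as a simple check. The 16-hour tolerance mirrors the rough figure mentioned; real protocols and certificate checks each pick their own window.

```python
from datetime import datetime, timedelta, timezone

# Assumed tolerance, echoing the "16 hours or so" above.
MAX_SKEW = timedelta(hours=16)

def clocks_agree(local: datetime, remote: datetime) -> bool:
    """Security-style default: answer 'no' when the clocks are too far apart."""
    return abs(local - remote) <= MAX_SKEW

now = datetime(2015, 6, 19, 12, 0, tzinfo=timezone.utc)
assert clocks_agree(now, now + timedelta(minutes=5))        # small drift: fine
assert not clocks_agree(now, now + timedelta(days=365))     # clock a year off: refused
```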

 

"The law of Contagion - something once attached to something still affects it":

Laser printers are known to embed a watermark on the documents they print, making it possible to trace a document back to the printer that printed it:

https://en.wikipedia.org/wiki/Laser_printing#Anti-counterfeiting_marks

The licensing systems of Microsoft and Adobe equally "marry" a license to a piece of hardware. Between two installations of a program on the same OS, user settings and licensing data are often kept (in the user profile).

With the new UEFI technology it became common to actually store the Windows serial key in the UEFI. The Windows 8 DVD is known to be able to read out the key assigned to the hardware automagically.

When you buy something in a PDF shop, you get a personalised version, in the sense that the order number and original buyer are noted in the document. So if you distribute it, they know who did it and can get back to you.

In NTFS, alternate data streams can carry quite some data. They are often used to mark files as "came from the internet, might be unsafe" and will persist across all other NTFS media.

 

All this information can be taken out by those who know how to.

 

"The law of Nomenclature - the name is the thing. Holy or demonic names are carved into objects to increase their power":

It used to be that the IP address was the one unique identifier for a server, one that rarely (if ever) changed. Names and DNS servers were only added to have human-readable names.
With the increasing shortage of IP addresses, the inverse happened. DynDNS thrives on the whole idea that the IP address changes. Administrators are encouraged to use names rather than hardcoded IP addresses. The name is the server. Often the only pieces in a network with truly fixed IP addresses are the router and the DHCP server. All others will be given one by the DHCP server.

 

Certificates. They are the third leg of proper encryption. Everyone can make one. But only those made by the certificate authorities are fully trusted.

"By the authority of 'GlobalSign nv-sa', I truly am Wikipedia. There is nobody intercepting and changing the communication between us".

A certificate backed by a proper authority has more weight than just any certificate.

 

In a company network it could be said: "By the power of the domain controller and user account management server known as [insert server name here], I beseech you to grant me access to this printer".


Man, I could pad that out to the size of a phone book.  Off the top of my head:

 

The Law of Corruption: The larger and more complicated a system is, the more likely it is to attract demons or other corrupting influences that will eventually destroy it.  (This is happening to my Exchange database as I type this.)

 

The Law of Security: The more secure a system is, the less likely it is that it will actually work.

 

The Law of Spellbooks: Procedures that have been written down may not function as advertised even if followed to the letter.  The environment will almost certainly have key differences--software versions, for example--and the procedure itself may be missing key steps or contain other errors.



I have a more generalised law for this:

"The law of Fragility - a procedure can be as tough as a bunker, while also being a house of cards just waiting for the right gust of wind."

"Never change a running system" has been a rule for decades.


Corporate scheming can sometimes lead to the worst possible idea becoming reality (example: the PS3):

https://youtu.be/FoGnDgqNGkk

 

Summary:

The PS3 was known for a total lack of backwards compatibility, something they spent some time fixing. In turn, its games have terrible forward compatibility too. At the core of both issues is the unique processor design they chose, the "Cell architecture".

The video points out that Sony might have chosen the whole Cell architecture (the core of many issues) to make porting impossible. It probably was an attempt to "lock down" the console market in order to drive out the Xbox.

Add to that, they wanted to push Blu-ray players into the market to win the new format wars. Those made the console quite expensive, leaving no room to add the chips and connectors needed for backwards compatibility. Running PS1 games was still possible (it is kind of like running a DOS game on a modern Windows computer with an emulator).

So it lacked the hardware needed for backwards compatibility. The price was high. Few games were developed for it (because the design was terrible and you could hardly port those products). The only reason it did not totally fail was that the competition screwed up worse.

 

 

Format / de facto standard wars:

This is closely related to the previous topic. Quite often the big manufacturers or introducers of a new "tier" of technology will engage in a format war. This is always about long-term profits. Whoever has the "winning" format will get a steady cashflow from it for 1-2 decades.

The most recent and prominent ones have been HD DVD vs Blu-ray*, which closely mirrored VHS vs Betamax* (as both affected video storage for generations to come, including for the home user). But there have been a lot more, and one was actually once waged between AC and DC forms of electricity (AC won).

https://en.wikipedia.org/wiki/Format_war

Would an industrialised magic have similar format wars? A similar tiering of "magic technology"?

 

 

*Unofficially it is also said that both were decided by the pornographic industry.

 

An interesting side effect might be that an old record only exists in the losing format. In Cowboy Bebop it was actually a plot point that a recording was Betamax rather than VHS.


  • 3 weeks later...

There is actually stuff man is not supposed to think about in computers - time zones. Because Lovecraftian horrors have nothing on this insanity:

 

Whenever somebody asks about issues with storing and retrieving a datetime, the answer is:

Pick a single culture and datetime format for your code. Use it independent of the local culture.

Always store and retrieve the datetime in UTC notation.

Let the UI code deal with expressing it in local time.

Don't try to understand it. It just works (unless you deal with astronomical time).

And once you leave the Gregorian calendar or even go before 1970, all bets are off.
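The store-in-UTC rule above, sketched in Python (the +2 hour zone stands in for whatever the user's locale happens to be):

```python
from datetime import datetime, timedelta, timezone

# Store: always normalise to UTC before persisting.
local_zone = timezone(timedelta(hours=2))        # hypothetical user zone
entered = datetime(2015, 7, 1, 14, 30, tzinfo=local_zone)
stored = entered.astimezone(timezone.utc)        # 12:30 UTC internally

# Retrieve: keep UTC inside the system, convert only at the UI edge.
displayed = stored.astimezone(local_zone)
assert displayed == entered                      # same instant, local face
```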


Don't try to understand it. It just works (unless you deal with astronomical time).

And once you leave the Gregorian calendar or even go before 1970, all bets are off.

Snort.

 

Snigger.

 

Chuckle.

 

Hee hee hee.

 

Moo hoo HA HA HA HA HA HA HHHHHHHAAAAAAAAAAAAAAAAAAAAA!!!!!

 

Hee hee hee hoo hoo hoo HOO HOO HOO HOO WAAAAAAA-HAAAAAAAAA HAHAHAHAHAHAHAHAHA!!!!!!!!!

 

HAHAHAHAAAAAAAAAA!! HO HOO HOOOOOOOOOOOOOOO WAHAHAHAHAHAAAAAAAA!!!!!



Hey Christopher!

 

You just described electronics in general. :winkgrin:


Hey Christopher!

 

You just described electronics in general. :winkgrin:

Yes and no.

Electronics in general is dictated by the laws of physics. They define how complex something has to be and how complex something can be.

"Debugging" here is just finding out where your math is wrong or where reality (after measuring) does not match your assumptions.

 

It's still awesome work. But we programmers added a whole 'nother level to it.

 

We programmers are not limited in the insane complexity we can create. The only limit is our imagination and our drive to do it "even faster".

We are the only job group on the planet that is able to create problems we ourselves can no longer solve.

"Computers help us solve problems we would not have had without them".



Actually yes and yes.

 

Most of you do not even know how a circuit works. Now that they have the magic IC chip, most electronics types are not even taught electronics anymore. They don't even understand what a chip is.

 

Back before the "modern" programmer, programming required knowledge of both hardware and software. I can remember programming when we used punch cards and mag tapes were the method of information storage. So yes, before your day some of us did program. And we do know what it entails.

 

And you are very wrong with your assertion that electronics follows ironclad "laws of physics". If you were acquainted with it, you would realize that while some are mathematical laws, there is a very healthy dose of theory in the mix.

 

We used to roll with laughter when the ivory-tower crowd ventured into the field and hit real-world operational reality head on. "That's impossible" and "Electronics doesn't work that way" would come fast and furious.

 

Luckily, for every super-duper "look at me, I have a degree" out there, we still have some actual real-world technicians and engineers left.

 

But anyway, feel good, and I hope your haughty outlook on life treats you well...


  • 2 weeks later...

I got a partial education in electronics. I can read a schematic, I understand the laws of electronics, know how a circuit works, know of the different types of switches and short circuits, the danger of AC over DC, the trickiness of measuring how charged a battery is, the differences in currents by country, and the color coding of wires.

In fact I started out wanting to be an electrician. The theory was easy. I just failed utterly at the practice. My hands and hand-eye coordination suck. As in "I have a disability there".

I know not only programming, but also a fair share of administration and electronics. I keep my mouth shut about stuff I don't know for certain. And I am one of the first to admit when I am wrong.

 

Let's make a comparison:

Do electricians on any level still learn how and why a resistor has a certain resistance? There is a physical law behind that, but even I did not learn it. I only learned how to read the color code and how resistors work in formulas.

Do electricians on any level still learn Ohm's law? I wager they do.

The job a resistor has never changes. You might make more complex layouts of resistors and parts. You might have prefabricated parts (ICs or boards) to connect to one another for larger stuff. But you still learn what a resistor does and how it behaves in the formulas. In the end, one complex part or pre-made board is just a resistor or piece in a bigger plan.
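For what it's worth, both pieces of still-taught knowledge fit in a few lines; a sketch in Python of the standard 3-band color code plus Ohm's law (the example resistor is made up):

```python
# Digit value of each colour band (the part the poster says is still taught).
COLOR_DIGIT = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
               "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}

def resistance_ohms(band1, band2, multiplier_band):
    """Decode a 3-band resistor: two digits, then a power-of-ten multiplier."""
    digits = COLOR_DIGIT[band1] * 10 + COLOR_DIGIT[band2]
    return digits * 10 ** COLOR_DIGIT[multiplier_band]

# yellow-violet-red -> 47 * 10^2 = 4700 ohms
r = resistance_ohms("yellow", "violet", "red")

# Ohm's law, I = V / R: current through 4.7 kOhm at 9 V.
current = 9 / r
```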

 

The average programmer today might not even learn what a pointer is, much less how to program with them (and the non-existent difference between data and code). And that is while being only 3 levels removed from even the most interpreted language (source code - intermediate language - executing program & OS).

Pointers are as fundamental a part of programming as resistors and their formulas are of electronics. But pointers and their workings do not really matter for higher-level stuff anymore, so they are not learned.

That is how complex and removed from the hardware programming has become. We have more issues designing the hardware to work with our software than the other way around. Only with Windows 7 were we finally able to get rid of decades-old legacy stuff like the BIOS and the MBR. We dragged that around for so long not because we liked it, but because we needed backwards compatibility for our software.

 

If you took an electrician and sent him back to the start of electronics development, he would have to ask for a few values (currents, resistances) but otherwise could work with what they have there. That is how little the job has changed.

A modern programmer having to learn how to use punched cards (and the associated computers) or even just assembler languages would have to learn a lot of totally new stuff. Most of what he knows does not even apply at such a low level of programming.


There are levels of abstraction in both fields. For example, if you're building a circuit with digital logic elements, you can generally treat them as being no more complicated than the Boolean algebra expressions they represent. There's a lot of quantum mechanics involved at the base level of the individual transistors that make up the circuits, but they're made to insulate the user from that complexity.

 

Even if you're building an analog circuit with discrete transistors, you can frequently abstract the behavior down to a simple current ratio.


  • 2 weeks later...

It's just as Arthur C. Clarke said: "Any sufficiently advanced technology is indistinguishable from magic." And some of the corollaries: "Any technology distinguishable from magic is insufficiently advanced" (Gehm's corollary); "Any sufficiently analyzed magic is indistinguishable from science!"; and finally, "Any technology, no matter how primitive, is magic to those who don't understand it."

 

My favorite explanation of magic is "magic is the cheat codes for the world" - Warren Ellis's character Drummer in Planetary issue 7 (one of my favorite issues of an amazing series).

 

I've put a lot of thought into the concept of magic as mathematics, kind of a thought experiment to unify some of my favorite British paranormal/urban fantasy novels.

 

The Laundry novels by Charles Stross have the concept of computational demonology; the Rivers of London series has its Newtonian magic; and the Milkweed Triptych has old Enochian, time travel, and Nazi super-soldiers.

 

Much less grim is Off to Be the Wizard, the first in a series of books about a computer hacker who discovers a mysterious file that lets him access the source code of reality. Unfortunately, creating millions of dollars in his bank account gets the feds on his tail, and he has to escape to medieval England, where he discovers he is not the first to find the secret file, answering the question of just how bad things would get if a bunch of nerds had unlimited power.

