(I)t is impossible to convey the life-sensation of any given epoch of one’s existence — that which makes its truth, its meaning — its subtle and penetrating essence. It is impossible. We live, as we dream — alone. – Joseph Conrad (Heart of Darkness)

“We live as we dream, alone” is also the title of a great Gang of Four tune. What does any of that have to do with anything? Quite a lot, actually. Dreams are how the mind processes data in an unfettered state. They’re when the mind can access and explore all the possibilities of any given concept. You can’t fly or walk through walls while waking, but you can while dreaming. In your dreams you can be a successful business person or a famous artist or a hot lover. Sometimes all of the above. On rare occasions people have been able to harness those possibilities and turn them into reality. Sadly, those people assume everyone else can do it too, so you keep hearing shit like “follow your dreams” instead of “get a fucking job.” Worse are those who see the two precepts as somehow conflicting. They aren’t. Just ask any famous actor who’s bused tables or mowed lawns or done porn. That last one, no longer carrying the stigma it once did, now fuels dreams of a different sort.
In that vein I’ll be the first to admit that my dreams have taken me to some fun and unusual places. But not even in my wildest erotic fantasies have I thought of being raped by a pig.
According to Angie Houston of Ellis County, Texas – what, you thought this happened where sane people live? – her dreams came true and the family pig knocked her up.
Angie Houston recently told Ellis County News that she expects to give birth to a baby pig.
Houston alleges that she came home one night from playing miniature golf and was pounced on and raped by a 400-pound boar named Pete.
“People think I’m crazy and need mental help because of my story,” Houston said. “They ask how’d the pig get my panties off to mount me? I wasn’t wearing any panties is how! This pig been hot for me for years. Constantly sniffing at my genitals. He’s tried to rape me a hundred times. This is just the first time he successfully got inside me.”
Angie’s parents are supporting her claims.
“Angie knows she can’t have sex until she’s married and she promises she hasn’t,” her father, Don Houston, said. “Pete’s always been an ornery pig, so it’s not surprising he raped our daughter. After this incident, I butchered Pete out. The bacon I made him into is helping feed Angela and her baby; after all, she’s eating for two now and needs a lot of extra meat.”
Angie has refused to see an obstetrician to check her and the baby’s health or get a sonogram to see what the potential “pig baby” might look like.
“I just want to wait and be surprised by what my baby looks like,” Angie said. “My hope is that it’s not a mutant pig of some kind and it just comes out looking like a normal human baby, which I feel will be the case.”
This is a pic of the happy couple.
Okay, show of hands, how many of you think she’s lying?
Well, that’s all of us then so let’s move on.
But what if you’re not into pig fucking? In fact, what if you’re not human? What do you dream of then? According to Alex Hern at the Guardian, your electronic brain comes up with some very cool shit.
What do machines dream of? New images released by Google give us one potential answer: hypnotic landscapes of buildings, fountains and bridges merging into one.
The pictures, which veer from beautiful to terrifying, were created by the company’s image recognition neural network, which has been “taught” to identify features such as buildings, animals and objects in photographs.
They were created by feeding a picture into the network, asking it to recognise a feature in it, and then modifying the picture to emphasise the feature it recognises. That modified picture is then fed back into the network, which is again tasked to recognise features and emphasise them, and so on. Eventually, the feedback loop modifies the picture beyond all recognition.
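For the nerds in the audience, that feedback loop is simple enough to sketch. In this toy Python version a hand-rolled edge detector stands in for the real trained network (actual DeepDream runs gradient ascent on a deep CNN like Inception, which this is not), but the amplify-and-feed-back structure is the same:

```python
import numpy as np

def edge_response(img):
    """Toy 'layer': a Laplacian-style edge detector standing in for a
    real neural-network feature detector."""
    out = np.zeros_like(img)
    out[1:-1, 1:-1] = (4 * img[1:-1, 1:-1]
                       - img[:-2, 1:-1] - img[2:, 1:-1]
                       - img[1:-1, :-2] - img[1:-1, 2:])
    return out

def dream(img, steps=10, strength=0.1):
    """Run the image through the 'network', amplify whatever it
    responds to, and feed the result back in: the feedback loop
    the article describes."""
    img = img.astype(float)
    for _ in range(steps):
        img = img + strength * edge_response(img)
        img = np.clip(img, 0.0, 1.0)  # keep pixel values in a valid range
    return img

rng = np.random.default_rng(0)
picture = rng.random((32, 32))  # a stand-in grayscale "photo"
dreamed = dream(picture)
```

Ten laps around the loop and whatever edges the "network" saw get cranked up until the picture stops looking like the original, which is the whole trick.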
At a low level, the neural network might be tasked merely to detect the edges on an image. In that case, the picture becomes painterly, an effect that will be instantly familiar to anyone who has experience playing about with Photoshop filters:
But if the neural network is tasked with finding a more complex feature – such as animals – in an image, it ends up generating a much more disturbing hallucination:
Ultimately, the software can even run on an image which is nothing more than random noise, generating features that are entirely of its own imagination.
Here’s what happens if you task a network focused on finding building features with finding and enhancing them in a featureless image:
The pictures are stunning, but they’re more than just for show. Neural networks are a common feature of machine learning: rather than explicitly programme a computer so that it knows how to recognise an image, the company feeds it images and lets it piece together the key features itself.
But that can result in software that is rather opaque. It’s difficult to know what features the software is examining, and which it has overlooked. For instance, asking the network to discover dumbbells in a picture of random noise reveals it thinks that a dumbbell has to have a muscular arm gripping it:
The solution might be to feed it more images of dumbbells sitting on the ground, until it understands that the arm isn’t an intrinsic part of the dumbbell.
“One of the challenges of neural networks is understanding what exactly goes on at each layer. We know that after training, each layer progressively extracts higher and higher-level features of the image, until the final layer essentially makes a decision on what the image shows. For example, the first layer may look for edges or corners. Intermediate layers interpret the basic features to look for overall shapes or components, such as a door or a leaf. The final few layers assemble those into complete interpretations – these neurons activate in response to very complex things such as entire buildings or trees,” explain the Google engineers on the company’s research blog.
“One way to visualise what goes on is to turn the network upside down and ask it to enhance an input image in such a way as to elicit a particular interpretation,” they add. “Say you want to know what sort of image would result in ‘banana’. Start with an image full of random noise, then gradually tweak the image towards what the neural net considers a banana.”
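That "start with noise, tweak toward banana" trick is just gradient ascent on the image itself. Here's a minimal sketch where a fixed template plays the part of the network's banana score; this is purely illustrative and not Google's code, since in the real thing the score (and its gradient) comes out of a trained model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 'banana' template: in a real network the class score
# comes from a trained model; here a fixed blob stands in for it.
template = np.zeros((16, 16))
template[6:10, 2:14] = 1.0

def class_score(img):
    """How 'banana-like' the toy scorer thinks the image is."""
    return float((img * template).sum())

def dream_from_noise(steps=50, lr=0.05):
    """Start with an image full of random noise, then gradually tweak
    it toward what the scorer considers a banana."""
    img = rng.random((16, 16))
    for _ in range(steps):
        img += lr * template  # gradient of the score w.r.t. the image
        img = np.clip(img, 0.0, 1.0)
    return img

noise_banana = dream_from_noise()
```

After enough steps the pixels under the template saturate, so the noise image ends up scoring far higher on "banana-ness" than it started. Swap the linear scorer for a deep network and you get the trippy pictures.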
The image recognition software has already made it into consumer products. Google’s new photo service, Google Photos, features the option to search images with text: entering “dog”, for instance, will pull out every image Google can find which has a dog in it (and occasionally images with other quadrupedal mammals, as well).
So there you have it: Androids don’t just dream of electric sheep; they also dream of mesmerising, multicoloured landscapes.
All right, so now you can Google for pigs and get more than you ever dreamed of. That’s kind of cool I guess. But what if you want something more specific?
According to Caroline Reid at IFLScience, the universe has your ass covered. Scientists are working on, and have developed the beginnings of, a mind-reading computer.
Right now they have this modern marvel turning your thoughts into written words. Brain-to-Text they call it.
The study designed to test this new concept, published in Frontiers in Neuroscience, required participants who already had electrodes fitted in their brains. This is because external, brainwave-reading caps, which record electrical activity across the scalp, are not sensitive enough to pick up the sharp signals needed to identify individual letters. The skull blurs this sensitive information.
This limited the number of people who could participate in the trial to seven, all of whom suffered from epilepsy and already had electrodes implanted in their brain to treat it. Unfortunately for the researchers, the electrodes were only put in the regions of the brain that required rewiring, and thus were not evenly distributed everywhere.
With no way around this limitation, the participants were asked to read different passages of text aloud while their neural data was read by a computer. The passages read included JFK’s inaugural speech, Humpty Dumpty, and even Charmed fanfiction.
As the individuals spoke the words, the computer had to learn to recognize the individual sounds they were making and match them to the corresponding brain waves. Eventually, the computer was able to pick out different brain patterns and match them to sounds.
The results were encouraging. The Brain-to-Text software was consistently more accurate at classifying phonetics than a randomized model.
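"Consistently more accurate than a randomized model" is easy to picture with a toy classifier. The sketch below invents synthetic "brain recordings" (the real study used implanted-electrode data and fancier models, obviously), learns one average activity pattern per phoneme from training samples, then checks that matching new recordings to the nearest learned pattern beats blind guessing:

```python
import numpy as np

rng = np.random.default_rng(42)
phones = ["AH", "S", "T"]  # a few hypothetical phoneme labels

# Synthetic stand-in for neural recordings: each phoneme gets its own
# mean activity pattern, and each utterance is that pattern plus noise.
centers = {p: rng.normal(size=8) for p in phones}

def record(p):
    return centers[p] + 0.3 * rng.normal(size=8)

train = [(record(p), p) for p in phones for _ in range(20)]
test = [(record(p), p) for p in phones for _ in range(20)]

# "Learn" each phoneme's brain pattern: average its training samples.
learned = {p: np.mean([x for x, q in train if q == p], axis=0)
           for p in phones}

def classify(x):
    """Match a recording to the nearest learned pattern."""
    return min(phones, key=lambda p: np.linalg.norm(x - learned[p]))

accuracy = np.mean([classify(x) == p for x, p in test])
chance = 1 / len(phones)  # what a randomized model averages
```

With only a handful of samples per phoneme the toy version already clears chance, which is roughly the bar the paper says Brain-to-Text cleared with its seven participants.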
“This is just the beginning,” said Peter Brunner, a coauthor of the study. “The prospects of this are really endless.” The paper comments that traditional speech-recognition software has thousands of hours of acoustic data to model and refine the software, whereas Brain-to-Text has just two or three samples from seven people. With more trials and tweaking, the software can only get more accurate.
The technology can’t easily be made commercially available because when it comes to brains, one size definitely does not fit all. The brain waves that transmit phonetic data are so sensitive that every brain will need to be assessed individually. Also, there is the issue of inserting a network of electrodes directly into the user’s brain. The increase in quality of life therefore needs to be greater than the risk of brain damage or surgical complications.
I can’t wait for scientists to hard wire Angie and see what comes out.
In all seriousness, the possibilities for tech like this are staggering. People like Stephen Hawking could talk in real time. The severely disabled would no longer have to rely on some random caretaker’s interpretation of their movements to make their desires known.
“That’s how Timmy blinks when he wants pudding” may not be as accurate as they hope. For all we know it’s Timmy’s way of saying “KILL THE FUCKING PUDDING PEOPLE! MAKE THEM STOP!”
This tech could clear that right up.