Jun 20

Women in Computer Science

Yesterday, Google announced their “Made with Code” initiative, which hopes to inspire young girls and, ultimately, get more women into Computer Science (official blog post, official press announcement). As a precursor to that project, Google performed a nationwide research study to determine what factors do (and don’t) impact a young woman’s decision to pursue Computer Science in college. Guess who was tapped to write the whitepaper that summarized the findings? Hint: it was me, and it’s finally available publicly.

As a woman in Tech and a Computer Science graduate myself, getting more women interested in and actively pursuing Computer Science is an issue near and dear to my heart, so I was super excited to help with the whitepaper and I strongly encourage reading through it, especially if you have girls/young women in your life, but even if you don’t. Some of the findings were a little obvious, but some of them were also kinda surprising (at least to me). There were a number of editorial cycles, but I promise I worked hard to make sure it was as generally accessible as possible. Encouraging female participation in Computer Science is something everyone can, and should, care about. After all, the very first computer program and the very first compiler were both written by women (Ada Lovelace and Grace Hopper, respectively). We’ve been in the field since the beginning; let’s make sure we’re still here in the future.

Jan 08

Down the Rabbit Hole; Or, How a Writer Became a Scientist… and then a Writer Again

I can still remember the day the course of my life shifted for the first time: it was a cold winter evening in Maryland, and I was sitting at the kitchen table staring at a list of available courses for the high school I expected to attend the following year.

An offhand comment from a teacher about prerequisites had inadvertently sent me on a mission. We had pseudo-electives in middle school: each semester you could take any two of Art, Home Economics, or Shop class, and the decision for one semester didn’t alter what you could take the next. The revelation that I could choose my high school electives poorly as a freshman and lock myself out of interesting classes as a junior or senior upset and scared me, and I was determined to be ready when the time came.

I had been squinting at the documents for hours, trying to reconcile the list of available courses with the list of requirements for graduation as defined by the school and the list of requirements for graduation as defined by me. But I didn’t really understand what I was looking at and I hadn’t made any progress.

I glared in the fading light at the pages in front of me and finally looked up at my mother, who was cooking nearby. “I don’t know what I should pick…” I paused to look accusingly at the pages again. “I guess I should pick something that will help me for college and my job and stuff… but…” I sighed and looked at her expectantly. “What do you think I should do?”

My mother stopped cooking and gave me her full attention. “What do you want to do?”

“I want to write.”

“Then be a writer. Pick classes that will help you with that.”

But I didn’t know what classes would help with that. More importantly, I couldn’t think of a writing job that wasn’t writing fiction, being a journalist, or teaching English, and I didn’t want to teach or be a journalist. And my fiction was terrible.

I could have asked someone for advice or suggestions. But for some reason it felt like these were decisions that I should be making on my own. Like they were going to shape my life.

And they did.

At thirteen, I decided I would follow the third of my three passions – the first being food and the second being writing – and become a scientist. It was the “Plan B” that gave me something to fall back on until I could figure out a way to make a living as a writer.

At first, because I was interested in Astronomy and Cosmology, I was determined to be an Astrophysicist, but I realized early in high school that applied math (and as a result, Physics) was not for me. It was boring. And not just a little boring, but really, really boring. Fall-asleep-in-class-then-wake-up-to-find-a-puddle-of-drool-on-your-desk boring. I started to think the whole Astrophysicist thing wasn’t going to work.

I never made it to the high school whose course listings I had studied so intently all those years ago; we moved to Hawai’i the summer before I started high school and then to California two years later. The heavy sunshine and bright bird song that slipped through the slats of my bedroom window on the back of a cooling breeze couldn’t have been more different from the winter stillness of that fateful day in Maryland, but the scene was essentially the same: I was sitting on my bed poring over college course descriptions and degree offerings and trying to figure out what to do with my life.

I had no intention of spending four years of college drooling on my desk just to get to the “good stuff”, so Astrophysics was definitely out. But now my senior year was looming and I hadn’t picked a replacement.

Once again, the prompt that directed my life came from something small. My family’s first computer had been a Commodore 64 and it came with a tutorial on programming in BASIC. Most of my early coding attempts were abysmal failures, but I had written one program, a number guessing game, that I had been very proud of. I decided I could stand to do a little more of that, settled on Computer Science as a major (Computer Engineering had too much applied math), and sent out my applications.
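If you’ve never written one, a number guessing game is about the simplest interactive program there is: the computer picks a secret number and hands out “too high/too low” hints until you find it. Here’s a minimal sketch of the idea in Python; the C64 BASIC version would have been all line numbers and GOTOs, and nowhere near this tidy:

    import random

    # A minimal Python sketch of a number guessing game
    # (an illustration, not the original BASIC).
    number = random.randint(1, 100)  # the computer picks a secret number
    guess = None

    # The player guesses, with hints, until they find it.
    while guess != number:
        guess = int(input("Guess a number from 1 to 100: "))
        if guess < number:
            print("Too low!")
        elif guess > number:
            print("Too high!")

    print("You got it!")

The shape is the same in any language: one loop, a couple of comparisons, and a pair of hints.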

There weren’t any drool puddles to disrupt my course this time, so eight years and two Computer Science degrees later I was working full-time for a small biotech company in San Francisco. And that’s where I stayed for six years: first as a Software Engineer, then as a Senior Software Engineer, and finally as a DBA/DBE.

I enjoy working in Software.

I don’t particularly like the nitty-gritty, day-to-day of coding, but I enjoy everything else: the design work, tackling complex problems, distilling order from chaos, and the teamwork. Especially the teamwork. Coming together to conquer what feels like an insurmountable problem, what my father calls “staring into the face of failure and making it blink”? That was the best part.

But something changed in those last two years.

We were working on a monolithic project – a Software/IT overhaul for a CLIA-certified lab – and the documentation requirements were immense. Since I was the only person on the Software team who actually enjoyed writing, some of the documentation work (as much as my other responsibilities would allow) was delegated to me. Maybe I had reached my saturation point for tech-related stress, or maybe I was just tired of doing battle with petulant database installations, but I eventually noticed that the days I woke up eager to go to work were also the days that had been set aside for me to work on documentation.

And I hadn’t been eager to go to work in a long time.

The realization hurt. It was like an old friend I really cared about but had been ignoring for years suddenly showed up in my cubicle, hugged me warmly, sat down, and asked me reproachfully, “Why don’t we talk anymore?”

If present-day me, high-school me, and 13-year-old me have anything in common, it’s this: we love to write and we’re not afraid to follow through on life-altering decisions made between heartbeats. I couldn’t answer my friend’s question, but I knew how I was going to fix things. I sat at my desk, watching output scroll by from the latest Oracle installation I had bullied into behaving, and knew I was done working as a Web Developer. Or as a Software Engineer. Or as a DBA.

I was a Writer.
And it was time to write.

Little did I know then how different things would be on the other side of the looking glass…

Nov 01

Parody Takes More Work Than a Gotcha at the End

On Halloween afternoon, an article entitled 10 Signs Your Girlfriend is a Fake Gamer was published at ComplexGaming. The author, Justin Amirkhani, has stated that the article is a parody and I believe him, but he has also been very dismissive of criticism, and that is a problem.

Here’s the thing: whatever else a writer is trying to accomplish, their primary objective is always communication. It doesn’t matter if your ostensible goal is to educate, entertain, or expound; if people are rage-quitting in the middle of what you’ve written, then you’ve failed as a writer because your message is being lost. This is particularly problematic if, as is the case here, you’ve saved the entirety of your social commentary for the last paragraph of your piece.

Unfortunately, good intentions do not negate poor execution and, regardless of his intentions, this article fails as parody. A parody is satire, and satire, by definition, leverages irony, sarcasm, and ridicule to make social commentary. But there is nothing overtly ironic or ridiculous in this article, and the one sarcastic line – “Hey, wait a second, why does that sound so familiar?” – isn’t delivered until the absolute end. What the article does do is devote the vast majority of its word count to regurgitating unhelpful clichés, then try to undo that sentiment with a single paragraph at the end. That isn’t satire. That’s reinforcing stereotypes.

The biggest problem is that the images used in the article – every slide has an accompanying Fake Gamer Girl meme – undermine Amirkhani’s intent. In most of the slides the meme image is actually larger than the accompanying text, which means the reader will process the meme first and that message colors the rest of what they read on that page. The result is that even though the text of the article is gender-neutral, the title of the article (10 Signs Your Girlfriend is a Fake Gamer) and the attention-demanding memes accompanying every slide work together to establish social context and provide an implied subject for the judgments being presented: women. Or, more specifically, women “pretending” to be gamers. And while I do believe Amirkhani’s comments were meant to be interpreted as ridiculous, the fact remains that similar accusations have been levied – in complete seriousness – at women as so-called evidence that they are pretending to like games for the attention. The unfortunate result is that Amirkhani’s comments don’t come across as obviously ridiculous; they come across as more of the usual cultural inanity where female gamers are concerned.

Whether or not you subscribe to the idea that women are frequently and unfairly required to justify their right to call themselves gamers is irrelevant. Plenty of people do believe it and plenty more find the Fake Gamer Girl meme offensive. If you are going to give your article a deliberately inflammatory title and leverage a meme you know will offend some percentage of your readers for the sake of parody/satire, then it should serve a purpose and reinforce the absurdity of the situation you’re describing. Amirkhani’s article doesn’t do that. It presents the images without comment, as a complement to the sentiment in the text, and, with no clear evidence to the contrary, the reader is expected to understand/believe that Amirkhani doesn’t really mean what he’s saying.

Parodies, particularly good ones, sound logical and reasonable on the surface, but there is always something glaringly Not Right™ about them. When the supposedly ridiculous things in your article have actually been said and meant sincerely in reference to some portion of your readers, it stops being obviously Not Right™ and starts being Not Right™ only if your readers trust that you don’t actually mean what you are saying. And that’s the main reason this article fails: an article being interpreted as a parody should not depend on the author’s reputation; it should be obvious from the text. Jonathan Swift’s A Modest Proposal doesn’t work as a parody because people trusted that Swift didn’t really mean what he said (a good number of people actually did think he was serious); it works as a parody because the idea that children should be eaten to avoid being a burden on their family is clearly extremist and absurd.

The text of Amirkhani’s article is gender neutral but the context he establishes for the article is not – he seems to go out of his way to create the impression that he’s speaking specifically about women – and that’s the source of his undoing. It’s unfortunate that a well-meaning article (and I do believe it was well-intended) should fail so catastrophically, because the potential for parody and relevant social commentary is there. Had he omitted the Fake Gamer Girl meme (or at least balanced it with a Fake Gamer Boy meme, even if he had to invent it himself), avoided an inflammatory title that explicitly names women as the subject matter (simply swapping “Significant Other” for “Girlfriend” would be sufficient), and focused on a salient point he makes in the article’s introduction:

There are plenty more reasons why someone would pretend to be more interested in games than they actually are (maybe they just desperately want to share an interest with you)

the article would not have felt like it was jumping on the Gamer Girl Bashing Bandwagon, and the underlying message – that it’s wrong to judge someone’s worth as a gamer by how closely they line up with what you believe a gamer “should” be – might not have been so painfully obscured.

Aug 24

User Experience is not Technology

I went a little dumb in the head and read the comments on a few sites reporting on the Apple v. Samsung patent case today, and it’s got me fuming. Let me be clear: I am technology agnostic. You should use whatever piece of technology aligns best with your needs functionally, emotionally, and philosophically, whether that means you stick with one vendor for everything or use a combination. I have no problem with people who love Apple products, but the sentiments I’m seeing in the pro-Apple comments are infuriating.

This case wasn’t about protecting IP. It was about protecting monopoly.

Claiming that Apple “deserved” to win the suit and that “developers should stop [trying to copy Apple] and try creating something new instead” is childish and ignorant. Yes, Apple deserves compensation for their innovation – doing something new and compelling that demonstrably pushes technology forward should absolutely be rewarded – but Apple didn’t “build the iPhone/iPad from the ground up” as many of the comments are claiming. They streamlined/improved technology that was already there, added new elements, and integrated the pieces to create a unique user experience that no one had seen before. The iPhone was innovative and revolutionary but it wasn’t created from thin air.

And do you know what else won’t be created from thin air? Everything that comes after the iPhone. Contrary to what people seem to think, technological innovation is, more often than not, about doing something new with what already exists and not about creating something new whole-cloth.

Technology patents should be about protecting your technology: you figure out how to do something clever, you patent your method, and when it becomes wildly popular, if your competition wants to do it too, they have to find a different way to achieve the same thing. It may turn out that their method isn’t as efficient, robust, or aesthetically pleasing as yours, or they could end up with something better. The end result may be the same, but one experience will likely be considered superior because the underlying technology is superior.

Yes, that’s called copying. But it’s also called competition and it helps drive innovation.

And that’s why this lawsuit is problematic; it’s not about the technology, it’s about the end result:

Among the Apple-patented features Samsung devices were found to have been infringing upon: the “bounce-back” that happens when you scroll to the end of a list, double-tap zoom, pinch-to-zoom as well as the design and iconography on iPhones, iPads and iPod Touches. (source)

None of the things on that list are technology; they are user interface – or more precisely, UX (user experience) – elements. They define OS behaviors. The patent lawsuit wasn’t about how Samsung might implement those elements, it was about whether or not Samsung (or anyone else) should be allowed to implement them at all.

Put another way, Apple saying that no one should be able to include pinch-to-zoom functionality because they had it in the iPhone first is akin to Microsoft saying no OS should be allowed to use point-and-click menus because they had it in Windows first[1]. If you’re thinking that patenting point-and-click menus is ridiculous, well, you’re right. But so is patenting something like pinch-to-zoom. They’re both methods of interaction with an operating system, and they’re both considered pretty standard at this point.

Should Apple be compensated for getting there first? Absolutely. Whatever algorithms or code they are using to implement pinch-to-zoom, or any of the other functional elements, should be forbidden fruit to every other technology company. And they should go after anyone who uses their algorithm or code with a vengeance. But should they be able to sue the pants off someone who figures out a different (or possibly better) way to do the same thing? Absolutely not.
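To make the distinction concrete, here’s a rough sketch (mine, to be clear, not Apple’s or Samsung’s actual code) of one obvious way to turn a pinch gesture into a zoom factor. A competitor could produce the exact same behavior with a completely different method (smoothing the input, anchoring on the gesture’s midpoint, and so on), and it’s the method, not the behavior, that a patent should protect:

    import math

    # A hypothetical pinch-to-zoom calculation: one method among many.
    def spread(touch_a, touch_b):
        # Straight-line distance between two (x, y) touch points.
        return math.hypot(touch_b[0] - touch_a[0], touch_b[1] - touch_a[1])

    def zoom_factor(start_touches, current_touches):
        # Naive approach: scale by the ratio of the current finger
        # spread to the spread when the gesture began.
        return spread(*current_touches) / spread(*start_touches)

    # Fingers start 100px apart and move to 150px apart: zoom in 1.5x.
    print(zoom_factor(((0, 0), (100, 0)), ((0, 0), (150, 0))))

Same pinch, same zoom, any number of ways to compute it.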

Much like point-and-click menus, the UX elements listed above are becoming expected mobile OS elements and users will be disinclined to purchase phones or tablets that don’t include them; they’ll want to stay with what’s familiar. Oh, and that bit about iconography? They’re talking about the rounded corners on application icons. Rounded Corners. So if you create an operating system that uses icons with rounded corners, you’re violating Apple’s patents.


That is not a reasonable patent.

When you take user experience elements that are very rapidly becoming standards for user interaction and tell developers they can’t use those elements – not the technology that created those elements, but the elements themselves – for 20 years, you’re not protecting your IP; you’re stifling competition and innovation.

Our patent system is supposed to be about protecting creators but it’s very rapidly becoming a tool to limit, or destroy outright, competition in a variety of sectors. That’s not something we should celebrate, no matter who wins a particular case.

[1] Technically speaking, Microsoft didn’t invent clickable menus. Xerox’s Alto computer has that honor as the first computer to offer a graphical interface with clickable elements, circa 1973. But the analogy is still valid; as with iOS and pinch-to-zoom, you can argue that it was the popularity of Windows that helped standardize GUI menu behavior and set home computer users’ expectations for how they interact with their computers. If you’re interested in learning more about the evolution of graphical interfaces, Toasty Tech has compiled a timeline that includes screenshots of the old operating systems.

Mar 22

Science > Common Sense

This started out as a quick Google+ post, but the more I wrote, the more I had to say, and I thought it would work better as a short blog post. Earlier today I was struggling with some text at work, so I took a break to scroll through my Twitter feed and give my subconscious a chance to mung on the words that were giving me fits. As I scrolled through my timeline, I came across a tweet by Philip Bump (retweeted by @jonrog1) linking to an article.

I didn’t really want to click on the link. I had a pretty good idea of what I’d find, but I did anyway… train wrecks and all that jazz.

Most of the article just made me annoyed, disgusted, and depressed. But this line in particular made me angry:

Maybe because we have learned to be skeptical of ‘scientific’ claims, particularly those at war with our common sense

Santorum is basically saying that science is a scam, perpetrated by scientists, intended to pull the wool over the eyes of the rest of the population to the furtherance of their dogmatic ideologies. But “regular folks” are too smart for that kind of thing. They’ve got common sense and they can see right through those scheming scientists.


I know we often say that ignorance is bliss, but when did it become fashionable? When did how little you know about a given topic become the indicator that your perspective should be highly valued?

Contrary to what Santorum thinks, science is not “ideology”.

Yes, there is debate in science.

And, yes, there are scientists who lose sight of their responsibility as stewards of discovery and become comfortable with what they know. But guess what? That happens in EVERY field, not just science. And it happens because they are human; as we age, many of us tend to become comfortable not only with what we do know, but with what we don’t know, and we stop thinking outside the box (to borrow a phrase) – not so much because we’re okay with the limitations of the box, but because we’ve forgotten the box is there.

This is okay for humans (although it can cause generational clashes in fields where change happens rapidly). This is not okay for science. Which is why science is rife with new theories.

All the time.

We don’t see these theories in schools because only the theories that have passed the most rigor and have the largest buy-in from the scientific community make it into textbooks; the level of associated scrutiny means the theories have merit. It means that, at this moment in time, those theories represent science’s best guess for how things work. It’s the rigors of the scientific method and peer review, the regular integration of new ideas, and the unification of cross-disciplinary fields that make science powerful.

Teachers should absolutely include the fact that evolution is still considered a theory. Case in point: human evolutionary theory got a huge kick in the rear in 2009, when the analysis of a skeleton dubbed Ardi was published. At roughly 4.4 million years old, Ardi is over a million years older than Lucy, the previous benchmark among early hominid skeletons, and has shaken some of the founding principles of human evolution (namely the idea of a missing link).

But discoveries like this don’t mean evolution, as a whole, is wrong. It means we don’t understand some parts of it as well as we thought we did. There may be times where finding new information completely destroys an old theory, but more often than not the new information just helps to refine it.

When it comes to explaining how life adapts to changes, evolution is still the best theory we have that:

    • can be backed up by physical evidence,
    • holds up under peer review, and
    • can be reproduced in the lab.

Maybe, one day, someone will come along and upend everything we think we know about how evolution works, and a new, stronger, better theory will take its place. But that hasn’t happened yet.

That’s not a weakness of science.
That is its strength.

The ability to assimilate new information.
The ability to modify theories and change ideas based on verifiable evidence.
The ability to engage in healthy discussion and debate with peers.

Even beyond the need to provide people with a basic understanding of the world, these are the reasons why we teach science in school. And it’s why we don’t teach “common sense”. Common sense can’t be argued with or peer reviewed because it’s not based on evidence.

Common sense is called “common” because it is widely believed, not because there’s any inherent value in it. The pervasive (and growing) belief that the average person on the street somehow has more innate understanding of a complex topic than someone who has spent their life studying it – whether that subject is science, construction, or dancing – is mind-boggling to me.

This attitude (along with basic technological ignorance) is part of what made people willing to sign off on SOPA even though they didn’t understand it: their “common sense” told them the nerds[1] were overreacting.

Limiting a woman’s right to vote used to be “common sense”; should we bring that back too?

You know what?
Don’t answer that.

Given all the recent attacks on women’s health, I’m pretty sure I already know the answer.

[1] During the SOPA amendment hearings, the term “experts” was very quickly cast aside and the pro-SOPA contingent insisted on referring to Networking and Software Engineers, Tech Company representatives, and industry founders as “nerds”. Once I can find a cached copy of the hearings, I will link to some of the relevant moments.

Feb 07

GP52 Week 04 – Up


Gravity pulls at the skull
and tugs at the spine,
anchoring our footfalls
to the dusty face
of a spinning rock.

It snatches at our heels
and grabs tight to our legs
reminding us that we have:

Some call it a law.
Others argue it’s a theory.
No matter the label,
it rules our lives
providing confidence
that as we move forward,
each step will follow the last.

The wind thinks differently.
Movement is change.
Change can bring the impossible.
It rushes upward,
pushing against the blue.

And as they battle –
You are shocked to discover:
even this rule
is made to be broken.

the inspiration for this photo came from the Levitating Girl photos by Natsumi Hayashi; you can find more of her work on her website or Facebook page

the photo in this post was taken by Jan Chavis; you can find more of her work here or here

Jan 31

GP52 Week 03 – The Rule of Thirds


Diced in nine and
edged eight ways –
a seven-fold image
to spark the sixth sense
and ignite the other five.
Four points
from three slices
center two eyes
on one figure.
Framed in math and
composed by numbers,
a great shaggy cow
stands alone in the field
and keeps company
with the blossoms.

the photo in this post was taken by Jan Chavis; you can find more of her work here or here

Jan 24

GP52 Week 02 – Summer/Winter

Winter’s War

You’re sneezing and you’re stuffully
and ache from bone to skin.
The winter night is cold without,
and you are cold within.

You build yourself a quilted fort
and pour a cup of tea –
you know to quickly win this war
good weaponry is key.

Chemical armaments consumed
you burrow in your keep
as tiny armies deep inside
do battle while you sleep.

the photo in this post was taken by Jan Chavis; you can find more of her work here or here

Jan 18

A Brief Comment on SOPA

Today’s been an interesting ’net news day. Lots of people are commenting on SOPA (and, to a lesser extent, PIPA), but I’ve also seen comments on FB/Twitter/G+ implying the reaction to SOPA is somehow… inappropriate or repulsive. That this level of reaction should be reserved for more “important/serious” issues like gang violence, the homelessness epidemic, health care, education, etc. I mean no disrespect, and everyone is entitled to their opinion, but I firmly believe the people making those arguments don’t actually understand the scope of what SOPA/PIPA would do.

The idea that legislation which would effectively revoke one of the fundamental rights on which this country was founded, i.e., free speech, could be considered neither “serious” nor “important” is simultaneously frightening and mind-boggling to me. Even if I can somehow manage to put that aside… and I really, REALLY can’t… the fact that this legislation effectively allows large, established companies to shut down rival sites indefinitely (without a trial!) – thereby destroying their livelihood – should, as my father would say, un-kink the colon of every small business owner, entrepreneur, and anyone who cares about them, including family, friends, and would-be employees looking for a job in this shit economy.

To those who have compared SOPA to some of life’s other evils: yes, you are right.

Children going without enough to eat in this supposed land-of-plenty is horrible.
Children being shot where they stand as collateral damage in gang wars because they were in the wrong place at the wrong time is horrible and tragic.
The pathetic state of our education and healthcare systems is horrible.

These are all Big Fucking Deals.

But, guess what?

So is SOPA.
And so is PIPA.

These bills have the power to fundamentally change our way of life by making censorship and suppression of free speech a web standard, as ubiquitous as HTML, HTTP, and photos of cats. They would break the Internet’s infrastructure and destroy innovation, because every time something new came along that threatened the status quo, those who were threatened would have the power to stifle it. That means, among other things:

No (or very little) new technology.
Fewer new jobs.
Crippled entrepreneurship.

And did I mention: A LEGAL JUSTIFICATION FOR SUPPRESSING FREE SPEECH! That’s an honest-to-God Constitutional right being violated, people, not an extrapolation/interpretation of what the Founding Fathers meant by this-or-that.

I know that we all have our pet issues – those things that really make our blood boil and make us exclaim, “Why aren’t people taking this seriously?”

But that doesn’t make it okay to dismiss everyone else’s.

Understand what you’re talking about before you dismiss an issue just because you feel something else is “more important”. Ignorance is not an excuse; it is a weakness.

Jan 03

GP52 Week 01 – New Year’s Eve/Day


New Year’s Day

The optimists cheer:
“Eat black eyed peas! They will bring you luck.”
The pessimists chastise:
“There is no magic today! It’s the same as yesterday.”

The pessimists are right.
Turning the calendar
doesn’t undo the past
and erase our mistakes
or relieve us of our responsibilities –
   yesterday’s debts are still unpaid
   yesterday’s problems are still unsolved
   yesterday’s failings are still unconquered

The pessimists are wrong.
Often the difference
between feeling trapped
and feeling free
is having an excuse to believe –
   that change can be made
   that solutions can be found
   that forgiveness can be earned

There is power in that.

Embrace it.
Claim it.
Snatch it up
and wring the magic from it until
good luck spills out
like foam from a champagne glass.

the photo in this post was taken by Jan Chavis; you can find more of her work here or here
