When It Feels Like There’s Nothing Left to Be Written

Nathan Bransford gets it right. Again:

by Nathan Bransford

There’s a fantastic moment in the movie The Truman Show where young Truman tells his teacher that he wants to be an explorer like the great Magellan. His teacher pulls down a map and says cheerfully, “Oh, you’re too late! There’s nothing left to explore!”

It can sometimes feel this way when writing too. There are hundreds of thousands of books out there. Every genre feels well-worn. We have the voices of hundreds of writers swimming around our heads.

How can we stand out from the pack? How are we going to get someone to read our book instead of all the other ones? What’s going to make ours different and better?

Writers are often their own worst enemies in this regard. The type of person who will eventually write a successful novel is adept at spotting their own flaws, and mistakes are plentiful at the beginning of the novel-writing process.

What often stops would-be writers in their tracks is that their first efforts aren’t very good. And they know it. The voice sounds like another author’s voice, the plot feels like an imitation of a book they’ve already read, and it doesn’t start out feeling particularly original.

Read it all

Neil Gaiman: “The Best Way to Come Up With New Ideas is to Get Really Bored.”

When Neil Gaiman writes, people read. And when he talks, people listen. So what do we call it when we read what he’s said?

“People risten?”

“People lead?”

FWIW, we’re leading the ristening right here:

Neil Gaiman prepares for social media ‘sabbatical’

by Richard Lea

Fans of Neil Gaiman can anticipate an empty January 2014, when the writer is set to take a “sabbatical” from social media.

Speaking at the Guardian, where the author spent a day editing the books website, Gaiman announced that he would take a break from updating his 1.8m followers on Twitter, his 500,000 Facebook friends and maybe even posting for the 1.5m readers of his blog.

“I’ll be taking about six months off,” he said, “a sabbatical from social media so I can concentrate on my day job: making things up.”

There has been little sign that the output of the creator of The Sandman and American Gods has slowed since he took up blogging in 2001 or since he joined Twitter in 2008, in which time he has published award-winning novels such as Coraline in 2002, The Graveyard Book in 2009 and now The Ocean at the End of the Lane, out next week. He has also written two episodes of Dr Who.

Gaiman thanks his Twitter followers in his latest novel for helping him check the prices of sweets in the 1960s but confesses that he would have “written the book twice as fast” without them.

He says the problem isn’t the amount of time spent using social media; it’s how it spreads into every cranny of our existence.

“People ask me where I get my ideas from,” he said, “and the answer is that the best way to come up with new ideas is to get really bored.”

Watching school plays was ideal, he continued.

Read it all

Herbie J Pilato: Don’t Tug On “Superman’s” Cape – Or “Batman’s” Either

by Herbie J Pilato

For the solid success of any creative property – whether it be for television, film, the stage, new media, or the printed form – it’s all about the writing; getting the story right (write!) and fleshing out the proper development of the characters.

When it comes to the superhero genre, in particular, attaining the proper casting and wardrobe (i.e., the costume) plays heavily into the creative process in very real, tangible and pertinent ways.

Disney/Marvel have hit the nail on the head with Joss Whedon’s The Avengers, if not with the X-Men film franchise, which was ignited by Bryan Singer (X-Men: First Class featured atrocious casting and acting, while the earlier X-Men films destroyed the colorful costumes displayed so wonderfully in the comics and the animated TV series of the early 1990s).

I wish to the heavens that DC Comics/Warner Bros., proprietors of the Superman and Batman franchises, would get that.

But unfortunately they keep missing the mark.

And I just keep scratching my head as to why that is the case.

What exactly is the problem – especially with the casting, which should be a no-brainer?

Only tall, dark, handsome and broad-shouldered actors need apply for the Clark Kent/Superman and Bruce Wayne/Batman characters.

And not just tall actors – but actors who are at least 6’ 5” in stature (minus their Super or Bat boots).

And not just actors with “okay shoulders” – but rather those actors with way-larger than average shoulders.

Yes, Henry Cavill, the star of the new Man of Steel, is a nice-looking guy and yes, Brandon Routh, of Superman Returns (also produced – and directed – by Bryan Singer), nicely resembled the late, great Christopher Reeve (who will, to some, always be considered the definitive Superman).

But neither Cavill nor Routh is larger-than-life handsome.

As to the Bat cavalcade of stars: Christian Bale was wonderfully brooding in the Dark Knight trilogy. Michael Keaton delivered a nifty “everyman” spin as Bruce Wayne in the earlier round of B-man films. And George Clooney, though Cary Grant-handsome, and Val Kilmer just didn’t measure up as The Batman (or whatever they’re calling him these days, with or without the “The”).

But again…none of these actors possess the super-spectacularly-handsome looks that are required for the Super or Bat roles.

I’ve yet to see anything close to the overtly charismatic likes of a Christopher Reeve or George Reeves (from the classic Superman TV series), for that matter.

And although Henry Cavill was relatively unknown before he was cast as the new Man of Steel – and casting an unfamiliar face is always the best way to go with these types of franchises (so as not to confuse star power with character draw) – he just doesn’t have the presence.

And although the new Man of Steel is making a bazillion dollars at the box office, I additionally wish that someone in the DC/Warner Bros. feature film division would please wise up and start hiring the genius writing team from the animated Batman, Superman and Justice League series to scribe the next live-action Superman (or Wonder Woman) movie – and certainly the live-action Justice League feature film itself. (Those guys know their stuff!)

Into this mix, just imagine if unknown, super-handsome actors were cast in the Bat and Super leads? And just imagine if the assigned writing dream-team would make certain to incorporate the proper costume designs (from the original comic book visions – instead of messing with some kind of newfangled look)?

EVERYONE would be pleased – on so many levels.

To reiterate: in ANY creative venue – with any creative property, be it for the big screen or small, Off-Broadway or right smack on top of it – writing into the script the proper casting and costume is just as crucial as developing the story and characters. And when it comes to remaking or re-presenting Superman and Batman in particular, two of the most popular superheroes and media/pop-culture icons of all time, well, ya’ just can’t mess around.

The Dark Knight trilogy was a success because it was different, and it made sense (there was indeed a good story and great character development). And the Man of Steel flick is fast becoming a success because the audience might be desensitized, or just may have plain given up on the DC/WB powers-that-be ever getting it right. The Super-movie-goers might just now be figuring, “Well… it’s better than nothing.”

Meanwhile, too, the studio may just not have the guts to admit they were wrong (again!) in blowing another Superman film (especially one that cost as much to make and promote as Man of Steel).

Either way, I’m reminded once more of The Brady Bunch. That’s right: The Brady Bunch (see the Why The Brady Bunch Is Still The Best post)… how Barry Williams’ Greg Brady, of all characters, was hired by certain unscrupulous entities because he “fit the suit” as Johnny Bravo in that famous fifth-season opener of The Brady Bunch.

Fortunately, at least Greg had the integrity to walk away from the shallow, singular success that was offered him with the false bravado of the fabricated Johnny Bravo, and instead returned to his family’s more-sincerely rooted musical brother-sister group.

As such, when it comes to getting it right with the creative casting and costuming of Superman and Batman, it looks like the WB/DC group should take an applicable lesson from the G.B. troupe… and stick to the original larger-than-life but loyal comic book renderings.

The Right Mindset for Creativity

Everything we need to know about getting our head in the right place so we can, you know, make up stuff… and turn it into that lovely little thing called art. (Well, it wouldn’t sound nearly as cool if we said “that lovely little thing called product,” would it?)

by Dr. Heidi Grant Halvorson

In his wildly popular 2006 TED talk, Sir Ken Robinson defined creativity as “the process of having original ideas that have value.” Aside from being wonderfully succinct, this definition implies that any creative enterprise requires two key phases:

Phase 1: Coming up with an original idea

Phase 2: Taking a hard look at that original idea and assessing its value

So to be a successful creative, you need to not only be a good generator, but also a good evaluator. The problem is that in practice, it’s remarkably hard to be both. And the reason for that has everything to do with your motivational focus – how you think about the goal you are pursuing when working on a creative project. One kind of focus heightens your creativity, while a different focus gives you the analytical tools you need to assess your work. The good news is that you can actually shift yourself from one focus to the other in order to bring your best game during each phase of the creative process.

When you see your goal as an opportunity to advance – to gain something, or to end up better off – you have what psychologists call a promotion focus.

Read it all

TV Ain’t What It Used to Be… And It’s Gonna Be Even Better

Ars Technica presents the best look at television’s creative and technical synergy that we’ve ever seen. Don’t just read this, memorize it!

The Trajectory of Television—Internet rebellion and hardware renaissance
by Casey Johnston

Though television has existed for well under a century, its mark on culture and society is indelible and undeniable. Last week, we described how TV got to where it is today: by traveling a winding road of antennas, black and white broadcasts, news and game shows, broadcast formats, reels, and remotes.

These days, TV as we’ve known it is facing a confusing time. After a long, comfortable, monogamous relationship between cable and satellite providers and the living room, the Internet has come at the concept of TV in full force. A rush of power has been handed back to the consumer.

Unease pervades most of what is happening with regard to video content, from the hardware we use to watch it to the services we pay to access it—if we pay for services at all. The power balance has been disrupted, and there is no clear resolution in sight.

The screens, in a slow fade

After circling the rows of glowing HDTVs and their massive price tags for years, customers finally started to buy into the new HD sets in droves seven or eight years ago. Today homes are saturated with them, and we can’t get them big enough.

Plasma TVs reigned in quality for a long time, especially at larger sizes. This was largely because it was impossible to make an affordable LCD much bigger than 40 inches. Now plasma is fading into the background a bit as increasingly massive LED-powered LCD TVs take over stands, walls, and shelves.

Though large LCD screens have become feasible, they aren’t growing heavier. The LED construction may carry a significant price premium over standard LCDs, but it allows the panels to be remarkably thin, to the tune of a couple of inches.

This isn’t to say plasma doesn’t still have its place, although negative impressions from burn-in trailed it for many years (in modern sets, this is mostly a non-issue). Certain plasma sets can still achieve better color balance and black levels than many LCDs, and plasma prices have come down over the years.

OLED TVs are likely the next big thing on the hardware horizon. At present, they’re in the price range of most advanced medical procedures—LG’s 55-inch model debuted for $12,000 in January. But with those hefty price tags come unprecedented thinness. OLEDs (organic light emitting diodes) work by emitting light when a current is passed through them, obviating the need for a backlight. Each pixel in an OLED screen is a diode. Because achieving black is as simple as turning the right pixels off completely rather than selectively dimming the backlight, an OLED can achieve black much more easily than an LCD or LED TV can.

OLED screens are also more flexible, which gives us sets like the weird curved panel from Samsung we saw at CES. It’s an attempt to achieve a pseudo-IMAX-like effect. While it doesn’t work completely on this scale, massive, curved screens (i.e., one continuous IMAX screen) could be technically feasible.

We can’t talk about where TVs are now without touching on the relative abomination that has been 3D TV. Consumer 3D TVs started to enter the market around three years ago to some initial excitement. But even as the sets progressed from requiring viewers to wear glasses to 3D effects you could see without headgear, the feature has failed. It hasn’t created a renaissance of compelling content; it hasn’t convinced anyone it’s an essential feature. 3D remains a gimmick.

That said, 3D comes standard on many TVs now. If customers are pursuing a new set, they are likely to get it whether they want it or not. It’s hard to see the feature as anything more than an add-on that was an attempt to revive TV sales and drive customers back into stores once the surge of HD sales died down (that it requires a whole new type of content helps TV and movie creators, too). 3D didn’t capture much attention, but cranking up the resolution again may reignite buying interest where 3D failed.

TVs are about to see their second significant resolution bump in history, from “HD” (720-1080p) to “Ultra HD” (4K-8K). 4K, which is actually 2160p, and 8K, which is 4320p, came out in force at CES this past year, with many a demo reel showing off extra-crisp content on the high-resolution screens.

At present, these sets still suffer from the same issue that HD did at its outset over a decade ago: they look great, but there’s no content to put on them. A few companies have stepped up to offer partnerships for 4K content, but it’s hardly a fast-moving trend. 4K will be a particularly difficult hurdle for streaming content, in terms of working against connection speeds and the few bandwidth caps that have managed to make it into certain service contracts.
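To put rough numbers on that bandwidth hurdle (our back-of-the-envelope arithmetic, not the article’s): each tier multiplies the pixels per frame, and at comparable compression the bitrate scales roughly in step. Here’s a minimal Python sketch; the ~5 Mbps baseline for 1080p streaming is an assumed, illustrative figure, not a measured one, and real bitrates vary with codec and content.

# Back-of-the-envelope: pixels per frame at each resolution tier, plus a
# crude bitrate estimate that scales linearly from an ASSUMED 1080p baseline.
RESOLUTIONS = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K (2160p)": (3840, 2160),
    "8K (4320p)": (7680, 4320),
}
ASSUMED_1080P_MBPS = 5.0  # illustrative only; actual bitrates depend on codec/content

base_pixels = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    scale = pixels / base_pixels  # how many 1080p frames' worth of pixels
    print(f"{name:<12} {pixels:>10,} pixels/frame  "
          f"({scale:4.1f}x 1080p, ~{ASSUMED_1080P_MBPS * scale:.0f} Mbps)")

Run it and the squeeze is plain: 4K carries four times the pixels of 1080p and 8K sixteen times, so even generous connection speeds and bandwidth caps get eaten quickly.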

But even as TVs have tightened their act and slimmed down like every nerd transformation preceding a high school reunion, our consciousness is turning slightly away. This is thanks to the Internet and the more natural home of Internet programming: our computers.

Read it all