Reading JD Lasica’s Darknet, about the clash between digital remix culture and Hollywood’s attempt to lock down content.
In one chapter, Lasica tries to clear rights to snippets of old movies for use in a personal, non-commercial video — The Mummy, Ice Age, Mary Poppins, The Treasure of the Sierra Madre — and collects the rejection letters from the major studios.
What a missed opportunity. The studios should permit this as a matter of course. And they should require the inclusion of a little “credits” widget that has a link to rent or buy the whole movie. Little snippets of video have the ability to evoke the memories of the rainy Sunday when the movie was seen first, and the impulse to watch the whole thing.
What better way to stimulate sales of back-catalog content? No cost, and found money. The rejection letters are the opposite of direct mail — targeted anti-marketing, designed to repel buying opportunities from primed and eager buyers.
How long will it take “long tail business opportunities” to really hack business models and buy back the law?
Archive for May, 2005
Network broadcasting has fallen behind cable TV in audience share. Broadcasters are supposed to switch over to HDTV by the end of 2006, but broadcasters say they’re not ready and customers are confused.
Instead of switching over to broadcast HDTV, will customers just abandon broadcast for cable, DVD, and emerging internet video? Will the confusion about HDTV hasten the decline of on-air broadcast?
The recording industry switched from LPs to CDs to mp3s without the government having to pass a law. The wireless market is migrating from 802.11b to g, and may get to WiMAX without a federally mandated transition. The requirement for federally regulated standards for on-air broadcast formats seems like a competitive disadvantage for on-air broadcasting.
File this under “insufficiently informed speculation” — I don’t know enough about this market to have a good opinion. I don’t watch much tv, so indifference to the glories of HDTV may be clouding my judgment.
According to this law.com article, Innovation and its Discontents has helped to spur the current drive for patent reform.
Legislators were also spurred into action by a book — “Innovation and Its Discontents: How Our Broken Patent System is Endangering Innovation and Progress, and What to Do About It” — published last year by business professors Adam Jaffe and Josh Lerner.
Stephen Fox, Hewlett-Packard’s deputy general counsel of IP, noted at a conference in San Francisco on Wednesday that members of Congress have been reading it and even marking particular pages. “They’re using it to get a perspective into the patent system,” Fox said. It’s given them “an aha moment — that’s what it’s all about.”
I caught Chris Anderson giving a talk on The Long Tail last week. The most interesting part of the talk (for folks who’ve read the Wired article) was the as-yet-unpublished research, drawing on several data sets, showing that Long Tail businesses really do shift the revenue mix from almost all hits to a roughly 50/50 split between hits and niche products.
Several follow-on reflections. As revenue in the entertainment industry gravitates to the Long Tail, how long will it take for Amazon, Netflix, and others to start buying back the law from mass media capture?
A mass media, hit-based business sees its value as selling the same product to as many people as possible. Anything that modifies the product, or seems to displace a sale to an individual is seen as harmful. A “long tail” business sees its value as fostering many revenue-generating niches. Therefore, anything that fosters the creation of new niches and subcommunities is seen as beneficial.
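To make the hits-versus-niches intuition concrete, here is a toy model (my own sketch, not Anderson’s data): if sales of the rank-r title fall off like a Zipf curve, the revenue share of the top 100 “hits” shrinks as the stocked catalog grows, because the tail keeps adding small but nonzero sales.

```python
# Toy model, not Anderson's actual data: demand follows a Zipf-like
# curve, where the rank-r title sells in proportion to 1/r**exponent.

def revenue_share_of_hits(catalog_size, num_hits=100, exponent=1.0):
    """Fraction of total revenue earned by the top `num_hits` titles."""
    sales = [1.0 / r ** exponent for r in range(1, catalog_size + 1)]
    return sum(sales[:num_hits]) / sum(sales)

# A newsstand stocks ~1,000 titles; an online store can stock millions.
for size in (1_000, 100_000, 1_000_000):
    share = revenue_share_of_hits(size)
    print(f"catalog of {size:>9,}: hits earn {share:.0%} of revenue")
```

With these (assumed) parameters, the hits’ share drops from about two-thirds of revenue at newsstand scale toward a near-even split at online-catalog scale — the shift the talk described.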
There are at least three legal preferences that are purchased by hit-based companies, and would be modified by “long-tail” companies:
* Long copyright. A “hit-based” business wants copyright terms extended indefinitely. A “long-tail” business would rather make it easy to clear rights on old, back-catalog works for renewed distribution.
* Criminalized sharing. A “hit-based” business sees sharing as stealing. A “long-tail” business sees sharing as the building of niches. See Yahoo Music and Grouper for examples in this direction.
* Criminalized remix. A “hit-based” business sees fan-community modifications as stealing. A “long tail” business sees fan community mods as the building of niches, and finds ways to make more money from enthusiastic and creative fans.
How and why is the US patent system so broken? News stories about dubious patents generate grumbling, annoyance, frustration, and perplexity.
An exceptionally good book, Innovation and Its Discontents explains what’s wrong with the patent system and how to fix it. Written by two economics professors at Harvard, Adam Jaffe and Josh Lerner, the book is short, clear, well-argued, and wears its erudition lightly.
Things haven’t always been this bad. But in the 1980s and 1990s, two separate reforms — of the patent courts and the patent office — combined for a pernicious result. Bad patents became much easier to get, and harder to overturn.
In 1982, the patent appeals court system was consolidated from 12 regional courts, which had vastly uneven standards, to one centralized court. The reform halted the practice of “forum-shopping”, whereby patent-owners rushed to accuse infringers in patent-friendly courts, while challengers rushed to seek hearings in patent-skeptical courts.
The practices of this centralized court made it much easier to sue for patent infringement and win. The percentage of patents upheld increased from 62% to 90% in the few years after the central court started.
A few years later, in the mid-90s, the Patent Office changed from a tax-supported agency, whose mission was to ensure that patents are valid, to a fee-for-service agency, whose mission was to quickly issue patents to those who apply. The fees from the Patent Office are siphoned into the general federal budget, while the office can’t keep qualified staff. 55% of patent examiners have less than two years of experience.
The result is that bad patents sneak through without good scrutiny. The average patent claim is reviewed for only 16-20 hours, which is half the time spent in the European Union. In the time available, patent examiners look for information that is easiest for them to find — other patents in patent databases. They don’t have the time or experience to look for other sources — like existing software and academic research — that prove that the “invention” is obvious, or not new.
Meanwhile, the patent review process is mostly closed — there isn’t a good way for third parties to share relevant information about prior art until after the patent is granted. Once the patent is granted, the legal system presumes that a patent is valid, and stacks the deck against attempts to overturn a patent.
A reform in 1999 was intended to create a “reexamination process”, but the process was watered down so badly that it is almost never used. The only kind of evidence that a challenger can present is other patents (not pre-existing software, evidence of historical business practices, or academic papers). The challenger doesn’t have the opportunity to explain the evidence. If a challenger applies for a patent re-examination and loses, they lose the right to sue later.
As a result, a lot of bad patents get issued, and they are very hard to protest or overturn. Technology companies use patents to gain license fees from competitors, who will settle rather than go to court, even if the patent is bad, because an infringement allegation is too costly and risky to defend. Large competitors create cross-license patent libraries that maintain the advantage of the leaders, and freeze out smaller players.
So how can the system be improved? Jaffe and Lerner recommend a tiered approval and review process, where patents can be issued quickly, but there are several stages where challengers and third parties can submit prior art and try to prove that the patent is obvious or not new. They also recommend reduced use of juries, who lack expertise to evaluate the information.
The book has interesting observations about the failure of patent reform efforts in the 90s. Talk-show celebrities including Oliver North and G. Gordon Liddy used the issue to grandstand against Japanese companies, which were competing against US manufacturers. Patent lawyers, who gain from the current system, were well-organized. At the time, the technology industry was not well-organized, and there was little public interest in patent reform.
Thanks to Doug Barnes for recommending the book, which joins my short list of favorite non-fiction. It takes a puzzling and potentially abstruse subject, and explains it clearly. It uses stories and well-chosen research data to make its points. And it shows a potential exit for the tangled mess of the US patent system.
Patent reform is in the works again in Congress. The book is very helpful context for the debate.
When a region wants to make a decision about whether to provide broadband as a publicly supported service, what better way than to put the matter to a vote?
Here’s what happened last year in an Illinois referendum about deploying a regional fiber network, according to BroadbandReports.com:
SBC spent $192,324 on defeating the ballot measure, while Comcast spent $89,740. Fiber for our Future, the community group pushing the initiative, spent $4,325. Just months after the first vote failed, the Illinois area in question saw Comcast rate hikes as high as 33% in some neighborhoods.
Something is seriously wrong with democracy when this can happen. It is old-fashioned to call for free campaign airtime, but I can’t think of a better way to actually rescue democracy from corporate purchase.
I was filling out the registration form for the Nashville City Paper and the Dayton Tribune this morning. And I realized that newspapers don’t yet understand an opportunity that Google News gives them.
Like many newspapers, they have added forms that require readers to register. The forms ask a question about newspaper readership habits, with choices ranging from “subscribe” to “pick up the paper at the newsstand” to “rarely buy the paper.” None of the choices fit the profile of a reader who picks up interesting stories from around the world on Google News.
I wonder how quickly the “out-of-town” visitor segment is growing. It could present opportunities for different types of ads (national) or different types of subscriptions (a bundle covering a very large basket of papers). I’m curious how much new circulation Google is sending to newspapers, and how it can become an economic opportunity for the content creators (it should be, when readers click through).
And I wonder how it feeds into the economic equation that supposedly adds up to the death of newspapers, based on stats showing younger people getting their news online.
As reported by Scoble, Microsoft is again in favor of the bill that bans discrimination against gay people in housing and employment (though it’s too late for the Washington legislative session this year).
The reversal was prompted by widespread complaints by Microsoft employees and media coverage, after the story was broken by The Stranger, an alternative newspaper in Seattle, and John Aravosis of AmericaBlog. So much for the lobbyists’ boast that nobody would notice.
In a memo posted on Channel 9, Microsoft’s online forum, CEO Steve Ballmer explained why Microsoft is taking a position on a public policy question:
Chris Dent adds purple numbers and paragraphs to Jason Kottke’s list of little things that are getting permanent addresses on the web. I think that’s right in some very interesting ways that are waiting for experiments and experience to show.
I have one big question about the usefulness of purple numbers that perhaps people who have worked with them can answer.
When I am editing, paragraphs are among the most malleable of units. Groups of a few sentences are combined to form larger paragraphs. Large paragraphs are split into smaller paragraphs. A few sentences from one paragraph are cut and moved to a different paragraph. After a round of such edits:
* purple references to an early draft will be very different from their referents in a later draft.
* the sequence will be garbled
* some references will be missing
So, perhaps purple numbers are only useful for final drafts — like a reviewed and published scientific paper.
But then, what about writing in a wiki? When a wiki is used as a canonical writing tool, the content is malleable all the time. How confident can linkers be about the stability of a referent?
By contrast, a wiki page or a blog post as a whole is a pretty stable link target. The content might change a lot (by the conventions and affordances of wiki) or a little (by the conventions and affordances of blogs). But the topic is probably the same.
A del.icio.us link entry or a Flickr photograph is stable, although the description and tags may change.
In practice, are purple numbers stable enough to be useful? Or are there certain cases where they are more useful than others?
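The stability question above can be made concrete with a toy model (my own sketch, not the actual PurpleWiki implementation): give each paragraph a persistent node ID when it is first created, and keep that ID when paragraphs are reordered or edited in place. A cut paragraph, though, takes its node with it — exactly the dangling-reference problem a heavily edited draft creates.

```python
# Toy model of purple-number-style addressing: persistent node IDs
# survive reordering, but a deleted paragraph leaves a dangling link.
import itertools

class Document:
    _ids = itertools.count(1)  # monotonically increasing node IDs

    def __init__(self):
        self.paragraphs = []   # list of (node_id, text) pairs

    def append(self, text):
        nid = next(self._ids)
        self.paragraphs.append((nid, text))
        return nid             # the "purple number" used for deep links

    def resolve(self, nid):
        return next((t for i, t in self.paragraphs if i == nid), None)

doc = Document()
a = doc.append("First thought.")
b = doc.append("Second thought.")

# Reordering paragraphs does not break the reference ...
doc.paragraphs.reverse()
print(doc.resolve(a))          # → First thought.

# ... but cutting a paragraph deletes its node, and links to it dangle.
doc.paragraphs = [(i, t) for i, t in doc.paragraphs if i != b]
print(doc.resolve(b))          # → None
```

The design choice this illustrates: position-independent IDs answer the reordering worry, but they cannot answer the split/merge worry, because a split produces genuinely new nodes.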
The Economist had a few encouraging stories about a new way of financing the environmental restoration of forests and wetlands.
The Panama Canal will suffer from floods and silting because the hills upstream are being deforested. So a forestry insurance company is seeking to issue a 25-year bond to pay for the forest to be replanted. The Economist lists a number of other examples of “downstream long bonds”. For example, New York City uses similar financing to keep the Catskill feeder streams clean, to protect the City’s water supply.
Of course, the “downstream long bond” solution only works when there is an identifiable, deep-pocketed downstream buyer. When the “buyers” are spread out — like citizens harmed by polluted air — government regulation is probably needed to assert the collective demand of those who value clean air.
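The arithmetic behind such a bond is ordinary present-value discounting. Here is a back-of-the-envelope sketch with made-up numbers (none of these figures come from the Economist piece): the downstream beneficiary compares the upfront cost of replanting against the discounted value of 25 years of avoided dredging and flood damage.

```python
# Hypothetical figures for illustration only — not from the article.

def present_value(annual_saving, years, discount_rate):
    """Value today of a constant annual saving over `years` years."""
    return sum(annual_saving / (1 + discount_rate) ** t
               for t in range(1, years + 1))

replanting_cost = 40_000_000   # assumed upfront cost of reforestation
avoided_damage = 5_000_000     # assumed yearly saving on dredging/floods

pv = present_value(avoided_damage, years=25, discount_rate=0.06)
print(f"PV of avoided damage: ${pv:,.0f}")
print("worth financing" if pv > replanting_cost else "not worth financing")
```

Under these assumptions the stream of avoided damage is worth more today than the replanting bill, which is why a bond against the future savings can pay for the restoration now.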
When I read Natural Capitalism a few years ago, it struck me that the lack of financing was one of the major obstacles to sustainable business practices — and that there were interesting opportunities for new financial instruments.
Another good sign is the rise of VC investments in clean energy from $509 million in 2003 to $520 million in 2004 — though this is still small potatoes for the $20B US VC market.
The mere fact that a market research firm is segmenting and forecasting industry revenues and VC investment is itself a good sign of an emerging market.