hacker news with inline top comments    15 Jul 2012   Best
How Amazon's ambitious new push for same-day delivery will destroy local retail slate.com
499 points by rmason  3 days ago   308 comments top 53
wheels 3 days ago  replies      
Amazon has been doing this in parts of Germany since 2009. The results haven't been nearly as dramatic as the article predicts. As others have indicated, the limiting factor isn't so much delivery speed as the inclination to physically inspect items before buying them. That said, Zappos, now part of Amazon, has succeeded in shoes, one of the markets that tends most toward such inspection. Even so, there are still a lot of places to buy shoes.
CitizenKane 2 days ago 2 replies      
This is already happening in China. Because of the nature of shipping companies here it's not unusual to buy a product (typically off of http://taobao.com) and get it later that day or the next day. Many of these businesses are based in Shanghai and if you live there delivery happens nearly instantly. One online shop, http://cheers-in.com/ delivers cold beer in Shanghai in 1 - 2 hours from an order. Stuff like this is absolutely fantastic and it would be amazing to see it come to the US.
fsckin 3 days ago 4 replies      
Amazon failed to collect/remit ~$270M in sales tax in Texas and has owed millions for years. So what did the state of Texas do about it?

They struck a deal to erase the $270M owed, as long as Amazon starts collecting sales tax on 7/1, creates 2,000 jobs, and invests $200M in Texas. They were likely already planning more infrastructure in Texas, but threatened to reverse course and pull out of the state entirely.

The way I see it, Amazon bluffed the state of Texas to the tune of 270 million dollars and all I might get out of it is next day delivery?

I'll take what I can get, I guess.

Caerus 3 days ago 6 replies      
> Amazon has long enjoyed an unbeatable price advantage over its physical rivals. When I buy a $1,000 laptop from Wal-Mart, the company is required to collect local sales tax from me, so I pay almost $1,100 at checkout.

This is such an overblown argument. Sure, Amazon is ~7% (where I live) cheaper than traditional stores due to sales tax. But that $1,000 Wal-Mart laptop has a $900 sticker price on Amazon, and most non-electronic items are 20-40% cheaper than in stores.

If laws change and I have to start paying sales tax on Amazon, it won't change a thing about my buying habits.

Edit: They also have an inventory many times larger than any brick-and-mortar store. Whenever I go shopping, I have to settle for the least crappy option Wal-Mart decides to stock. On Amazon, I get exactly the one I want.
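The arithmetic in this comment is easy to sanity-check. A back-of-envelope sketch using the figures above (7% tax, a $1,000 Wal-Mart sticker, a $900 Amazon price; all illustrative, not quoted prices):

```python
# Compare the in-store total against Amazon's, before and after
# a hypothetical change in tax law. Figures are from the comment above.
SALES_TAX = 0.07

def in_store_total(sticker):
    """Sticker price plus local sales tax, collected at checkout."""
    return round(sticker * (1 + SALES_TAX), 2)

def amazon_total(sticker, collects_tax=False):
    """Amazon's total, before or after it starts collecting tax."""
    return round(sticker * (1 + SALES_TAX), 2) if collects_tax else sticker

walmart = in_store_total(1000)                       # 1070.0
amazon_today = amazon_total(900)                     # 900
amazon_taxed = amazon_total(900, collects_tax=True)  # 963.0

# Even once Amazon collects sales tax, the lower sticker price wins.
assert amazon_taxed < walmart
```

Which is the commenter's point: the ~$100 sticker-price gap dwarfs the ~$63 of tax, so collecting tax alone wouldn't flip the comparison.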

api 3 days ago 4 replies      
More technology-driven hyper-deflation on the way.

The future: high inflation in food, energy, fuel, and consumables; hyper-deflation in everything else, except to the extent that it depends on or consumes the former.

PaulHoule 3 days ago 3 replies      
A general policy that "online stores pay sales tax" also benefits Amazon vs. small internet retailers.

A while ago I sold a few bumper stickers online and ended up using CafePress. I could have made a better profit by printing the stickers in bulk and mailing them myself, but I'd have had to spend $100+ on paperwork just for the privilege of paying sales tax in case I sold any to New Yorkers.

If small internet businesses had to pay taxes to the 40+ states that have sales tax, plus all the other jurisdictions (cities, counties, who knows what) in the U.S., it would be almost impossible to sell stuff and comply with the law.

For AMZN the overhead is nothing.

tomfakes 3 days ago 3 replies      
I bought 2 things on Amazon this week, both with free 2-day Prime shipping.

Item 1 spent 13 hours on a UPS truck driving around my city and was delivered around 7 pm, two days after ordering.

Item 2 was delivered 14 hours after purchase, by Amazon Fresh, at 9 am.

For the same shipping price, which service would you rather have?

EDIT to add: The delivery guy for the 14-hour item works for Amazon; the whole experience was handled by Amazon without needing a third party. UPS is another company that will be in trouble if Amazon can make this scale.

mthoms 3 days ago 1 reply      
This is the trojan horse. Once Amazon has a large presence in each major centre, the next logical step for them is use some of their massive space as a showroom (think Ikea but on a much larger scale).

[Edit: To clarify, the showroom and warehouse would be in the same complex but separated. Shoppers + heavy merchandise + fast moving robots is a recipe for disaster.]

Then they can satisfy both the "I want to see it before I buy it" crowd as well as the "I know what I want - just give me the best price" crowd.

Mark my words. Amazon has the Costcos, Walmarts, and Best Buys of the world squarely in its crosshairs.

*Of course there will always be specialty categories too niche to fit this model, so many specialized retailers will still exist.

andyl 3 days ago 0 replies      
Retailers pushed to impose the sales tax on Amazon. Now it looks like they are going to get what they asked for.
JoeCortopassi 3 days ago 5 replies      
Isn't this just one step away from Amazon being another brick-and-mortar? If so, is not having an actual store accessible to customers (and, coincidentally, the necessary staff) that much of an operational advantage?

Or is this just a case of the more efficient company (Amazon) beating out less efficient ones (Best Buy, Barnes & Noble, etc.)?

dr_ 3 days ago 1 reply      
"I have no idea how Amazon made any money on my order (the whole bill was less than $30) but several people on Twitter told me that they've experienced similarly delightful service."

Therein lies the problem. At some point there will have to be revenues to justify the company's valuation. I suppose their goal is to first obliterate the competition entirely and then have everyone purchase from Amazon. I'm doubtful this will work. Of late there has been a trend toward experience stores, with manufacturers creating their own stores instead of distributing to retailers. Many luxury brands do this, and even some non-luxe ones, such as Samsonite, have been getting into the game. There's some value added here, and it's something Amazon won't be able to directly compete with.

ddt 3 days ago 5 replies      
It'll destroy larger B&M stores, but I doubt it'll fully outdo local specialty shops. What you'll see is a stratification between hyper-specialty B&M stores that sell luxury items only a tiny subset of people want but are willing to pay through the nose for, and places like Amazon filling the role of Target and Walmart, the catch-all for everything else most people need on a week-to-week or month-to-month basis. While you might be able to buy a certain brand of organic mustache wax on Amazon, I don't see a day coming when they'll be able to do that same-day.
stretchwithme 2 days ago 1 reply      
Because the last mile is always so expensive and subject to theft, I think we'll eventually have a centralized local pickup location for all sorts of small deliveries. We'll just stop there on the way home if something has arrived.

That could even include your postal mail, if the postal service ever wakes up. There's no need to physically deliver mail every day if people can see what mail they've received remotely and pick it up, or request delivery if they can't leave home.

dkrich 2 days ago 0 replies      
"I'd gotten next-day Saturday service for free. I have no idea how Amazon made any money on my order (the whole bill was less than $30) but several people on Twitter told me that they've experienced similarly delightful service."

I've got news for you: they didn't make any money on your order. Amazon is profitable but has razor-thin margins on the whole. They take losses in several areas, including Prime, gambling that the whole seamless, low-cost experience will get enough people to spend enough money to push them over the edge into profitability.

I for one have serious doubts about whether what the author describes (based to a large extent on pure speculation) would be a profitable business model. It's one thing to keep huge amounts of inventory in centralized locations and ship items out across the country over the course of days. It's quite another to stock all that inventory redundantly throughout every state. So if I want to buy a specific pair of shoes that I can't find at Foot Locker, is Amazon going to stock 50 pairs on the off-chance that somebody down the street is going to buy them? Probably not.

In fact, the USPS runs a similar business model. One would then logically think (if this article is any indicator) that UPS and FedEx would go belly-up. I mean after all, if I can distribute mail to somebody's house for $0.50 in a day, how could UPS and FedEx compete with that? The reality is just the opposite. UPS and FedEx enjoy such enormous margins when compared to the USPS, that they want no part of the markets in which the USPS operates and it's too late for the USPS to compete with UPS and FedEx on parcels and important packages. I don't agree with the author's prediction at all.

bennesvig 3 days ago 5 replies      
The only advantage local retail stores have is immediacy. You can buy something as fast as you can drive there (and park, find it in the store, wait in line). Amazon has almost every other advantage. I've come to find shopping at local retail stores more and more frustrating: shopping without reviews or videos, and having to flag down employees to help locate items. It's really hard to beat online shopping with one- or two-day shipping. Not impossible, but challenging. Target, Best Buy, and other generic mass retailers would be hurt the most.
ktr 3 days ago 1 reply      
This is crazy - I ordered 3 things from Amazon today through Prime and started wondering if/when a day would come when you'd have same day delivery from Amazon and what it would look like. I figured it would happen someday, but thought the complexities would be too much to handle for a while. Looks like they're way ahead of me. This is why I love Amazon.
ori_b 3 days ago 0 replies      
When Amazon lets me physically try out the feel of things while browsing them, see how well built a tool is, or how a piece of clothing fits, it might have a chance of killing local retail.

Until then, I like to see which pants fit best, which knife is most comfortable in my hand, which tablet is most responsive. Yes, if I already know what I want, I'll go to Amazon. But if I'm not entirely sure and I want to compare things, I'll go to a local store.

peppertree 3 days ago 1 reply      
Amazon is playing chess while the brick and mortar stores are playing checkers.
learc83 3 days ago 3 replies      
The other day I ordered something from Amazon, and it showed up in a small cargo van, delivered by a guy without a uniform, in a package with no UPS or FedEx label (or any other delivery label I could discern, apart from the Amazon labels).

Was this some kind of test of Amazon-run delivery from a local warehouse?

sageikosa 2 days ago 0 replies      
At long last, the dreams of the Coyote ordering from Acme and getting immediate delivery of his latest gadget to catch the Roadrunner come closer to fruition.
jusben1369 3 days ago 1 reply      
Remember when home video was going to obliterate movie cinemas? Who would go out when they could watch a movie on their couch?! For a while, things did dip. Then people realized there was a social element to going to the movies that made it a compelling experience. Add to that IMAX and 3D, etc. Heck, who would bother going to an Apple store when you can buy everything online!

Amazon is to shopping what McDonalds is to food. We all know what happens when you have McDonalds every day.

k-mcgrady 2 days ago 0 replies      
I don't think it will destroy local retail. All the businesses within a 10-minute walk of your home will be fine; visiting them is still more convenient than same-day shipping unless you're very busy. It's the businesses slightly outside the core of town that will suffer. If it takes you 30-60 minutes to drive to a store, you might order same-day from Amazon.
mseebach 3 days ago 0 replies      
> Physical retailers have long argued that once Amazon plays fairly on taxes, the company wouldn't look like such a great deal to most consumers.

I love the audacity of "If you can't beat them, find some other way to beat them."

damian2000 2 days ago 1 reply      
Here's the original Financial Times article that this one was mostly taken from (nb: this link avoids their paywall via google's url redirection):


zeroonetwothree 2 days ago 0 replies      
Workplaces should be an easy place to get same day delivery. At a large office you probably have 100+ Amazon orders each day that conveniently get delivered by the company's staff. Amazon could have their own truck just drive straight to the office building without getting UPS involved at all.
adventureful 3 days ago 0 replies      
I thought Walmart already destroyed traditional local retail?

This one goes in the scaremonger bag.

jonhendry 2 days ago 0 replies      
Local retail is its own worst enemy, at least the stores that have websites.

If I go to the website of a brick and mortar store, chances are I want to know what they carry in-store, because I need something specific and am planning to visit a store.

Ideally I'd be able to see what they have in stock at the local stores, so I can avoid a needless trip.

Instead, many retailers have larded up their online stores with online-only products. Some don't even give you an easy way to exclude those and see only in-store items.

This is stupid, and just makes me more likely to not bother visiting the store at all, opting to just order from Amazon.

grandalf 3 days ago 0 replies      
I already don't shop locally except on a whim, or when there are items I'm not at all price-sensitive about. Well, groceries are an exception, but I like to pick my own produce.
webjunkie 2 days ago 0 replies      
In Germany, when you order in the morning before noon, you will most likely get your stuff by the next morning without any special shipping at all. That's how efficient the German postal service is :)
russell 2 days ago 0 replies      
I actually have a problem with Amazon's shipping. I live in a small town. Often FedEx shipments are dropped off at the USPS, so a two day delivery turns into three or more days. The worst was 5 days. UPS OTOH often delivers the next day.
hef19898 2 days ago 0 replies      
The biggest challenge I see coming for Amazon, besides competition and markets and all that, is their ability to handle their whole supply chain. Amazon has already proved they are really good at warehousing and distribution. But if one thing comes with an ever-increasing number of warehouses, especially if you want same-day delivery, it is an ever-increasing inventory level, in both number of items and value.

So from the outside, one cornerstone of a strategy like this would be inventory management at at least a regional, if not per-warehouse, level. It's doable, no question, but difficult. An advantage Amazon has here is a huge history of point-of-sale data for most of the important regions they operate in, whether or not there is actually an Amazon warehouse there. And if they don't have (enough) point-of-sale data, well, then it's too early to offer same-day service anyway.

All Amazon has to do now is keep operations up to the task... :-)

Shivetya 2 days ago 0 replies      
I remember my grandmother telling us how first the bus and then the family car let her shop where she wanted to. Stores adapted. Face it, a lack of transportation will let bad service persist simply because the customers are trapped.

When you're the only game in town, well.

So I don't buy the dire predictions.

S201 3 days ago 2 replies      
Looks like Amazon is finally bringing Webvan to fruition. Only took 11 years.
jsavimbi 3 days ago 0 replies      
I don't shop local any more unless it's for a premium item and the customer service is beyond average. Not when Amazon can deliver consumer items the next day.
ilaksh 2 days ago 0 replies      
OK so pretty soon Amazon will have same day delivery. That's going to be great!

Then, can we have evacuated tubes that connect our homes to all of the retail outlets, or homes to homes maybe also, and then they could have like 5 minute delivery, or however long it takes the picking robot to get it and then for it to travel at hypersonic speed through the tube?

Maybe we should all live within a few hundred yards of the warehouses. Then upgrade the picking robots and streamline the warehouses and tubes so they can deliver items in less than a minute.

Then we can all have robots that pull the items out of the tube, open the packaging and hand it to us on our couches/beds.

I would totally buy a giant plastic jug of cheeseballs right now if it could arrive in less than a minute and be delivered to my hands.

hrktb 2 days ago 0 replies      
I'd have died to have had that in the US or Japan, for example; anywhere delivery of basic goods actually works and isn't painful. I feel the main barrier to this is getting the local delivery company to deliver things on time and in a friendly fashion.

That's not something you can expect in every country Amazon operates in (France's Chronopost delivery is really hostile, for example, but there must be a ton of others). I'd bear with it for things I expect in weeks anyway, but it would be horrible for goods that are supposed to be there the same day.

ams6110 2 days ago 0 replies      
I don't see how they can possibly achieve the same economies of scale with a lot of small local distribution centers vs. a few huge ones. Plus coordinating deliveries in all those local markets vs. just having UPS or FedEx handle that part. To a point this may be successful, but it sounds self-limiting to me.
zackmorris 2 days ago 0 replies      
I thought of this November 3, 2010 :-)


baak 2 days ago 0 replies      
To be honest, I feel this could have a good environmental outcome as well. Same-day delivery putting physical retailers out of business means less overall driving.
tmuir 3 days ago 0 replies      
In the Chicago area, you can order from McMaster-Carr (mechanical hardware) and get same-day delivery. Any company that builds anything mechanical is probably ordering something from McMaster-Carr. This isn't an impossible problem.
parka 2 days ago 0 replies      
I hope this applies to international shipping as well.

I buy stuff regularly from Amazon, and their prices are cheaper than what my country's local stores offer, even with shipping included. E.g., a book from Amazon costs ~30% less than in the shops here in Singapore.

arjn 2 days ago 1 reply      
My opinion is that this may not be the end of local retail. I would still prefer to shop at my local grocer and farmers market on the weekend. Sometimes shopping is more than just shopping; it's also an outing with the family.
Scene_Cast2 3 days ago 1 reply      
There are a few fundamental, hard problems: energy and transportation. Amazon is trying to solve a subset of the latter here. However, I like to see and feel some things in person before buying. I often go through a lot of items before finding "the one". Examples: food, pens, shoes. It would revolutionize the industry if Amazon could solve that. Until then, it's just a really large online store.
Also, the USA isn't the world. Non-US Amazon is heavily subpar compared to other online stores: worse prices, smaller selection. So there's that, too.
stephen272 2 days ago 0 replies      
First the internet revolutionized music and eliminated record stores.
Then it revolutionized video and took down video stores.
It didn't stop there; it moved on to bookstores and wiped them out.
And now it's taking out brick-and-mortar stores in general.
Honestly, I LOVE IT! The internet is so great. Anything that gets stuff into my hands faster or easier is great in my eyes.
dave5104 3 days ago 1 reply      
I'm surprised Amazon just hasn't started their own logistics company. With the spread of their warehouses now, is the next logical step becoming their own USPS/UPS/FedEx? I can only imagine what would happen once Amazon takes over the part of the process that seems to always have the most problems.
emperorcezar 3 days ago 1 reply      
A business is required to pay sales tax, not collect it. Like some other businesses, Amazon could just include the tax in the price.

For instance, bars include the taxes in the drink price, because if they showed you how much tax is in a glass of beer you'd be very surprised; it's something like 40%.

It's a mental game, though. Do you get people "through the door" with an item at $19.99 and hope they don't close the tab when they see the $1.00 of tax, or do you advertise it at $20.99 up front and hope people are attracted by your name, etc., even though joesonlineshop.com has it for $19.99?
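The trade-off described here can be put in numbers. A minimal sketch, assuming a hypothetical 5% tax rate (chosen so $19.99 carries roughly $1.00 of tax); `checkout_total` is an illustrative helper, not any real API:

```python
# Two ways to present the same all-in price, per the comment above:
# tax-exclusive (add tax at checkout) vs tax-inclusive (bake it in).
TAX = 0.05  # assumed rate; ~$1.00 of tax on a $19.99 item

def checkout_total(advertised, tax_inclusive):
    """Return (price shown up front, amount actually paid)."""
    if tax_inclusive:
        return advertised, advertised              # no checkout surprise
    return advertised, round(advertised * (1 + TAX), 2)

# Strategy A: lure with $19.99, reveal ~$1.00 of tax at the end.
shown_a, paid_a = checkout_total(19.99, tax_inclusive=False)  # pays 20.99
# Strategy B: advertise the all-in price up front.
shown_b, paid_b = checkout_total(20.99, tax_inclusive=True)

# Both strategies cost the buyer the same; only the framing differs.
assert paid_a == paid_b
```

Either way the buyer pays $20.99; the question is purely which number gets them through the door.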

wilki 2 days ago 1 reply      
I do wonder how much online retailers have saved by not dealing with shoplifting and the overhead involved with a fleet of loss-prevention employees.
kfury 3 days ago 0 replies      
TL;DR: "Amazon's going to wreck retail now that they're giving up their unfair advantage and relying solely on better service."

Boo hoo.

sanjiallblue 2 days ago 0 replies      
Once 3D printing tech gets off the ground, same-day delivery could end up becoming common or even the standard.
mariuolo 2 days ago 0 replies      
I can think of several local stores I would like to be erased from existence.
snambi 2 days ago 0 replies      
Amazon is trying to become a specialized shipping service?
ck2 3 days ago 2 replies      
What this is going to do is punish people in non-metro areas.

The way that happens is the same way supermarkets work. In very competitive areas, cut-rate prices, coupons, and promos like doubling are offered very liberally. To make up for the lost profit, they pump up prices and cut promotions in areas with little to no competition.

In areas with a walmart and target and now amazon local delivery, prices are going to be crazy good.

Everyone else will suffer as they subsidize that lost profit.

craze3 3 days ago 6 replies      
(Disclosure: Slate participates in Amazon Associates, an "affiliate" advertising plan that rewards websites for sending customers to the online store. This means that if you click on an Amazon link from Slate, including a link in this story, and you end up buying something, Amazon will send Slate a percentage of your final purchase price.)

Do they really think that they'll get more affiliate conversions by being the honest guys? Seriously, what is the point of this?

Vim Creep rudism.com
464 points by DavidChouinard  2 days ago   228 comments top 35
kevinalexbrown 2 days ago  replies      
This is an extremely awesome, fantastic, exhilarating use of hyperbole.

What I gained from the article wasn't any explanation of why Vim is great; indeed, there was little included in this respect, aside from moving "entire blocks of code with the flick of a finger." Even the cool feeling of proficiency you get from knowing a tool well, and the fun of getting to show off a skill to a "how did he do that!?" audience was secondary.

My favorite part was the implicit reminder to relax about my tools. The most effective use of hyperbole is not the emphasis of a particular point, but a caution not to take ourselves too seriously, and place our self-satisfaction in proper perspective. The most powerful instances border on satire, and in that respect, this piece was perfect.

Addendum: I use vim and pentadactyl. They are extremely awesome, fantastic, and exhilarating.

crazygringo 2 days ago  replies      
This feels like an article about a religion or cult, not a software program.

I just don't get it. Ever since newer editors got block editing or multiple insertion cursors, and RegEx find & replace across multiple files, and searching filenames to open... I feel like I've already got everything I need!

What am I missing out on? I don't feel like my text editor holds back my productivity. Using something like Sublime, I never think, man, if only Sublime did x, it would save me five minutes, twenty times a week!

I've never had anyone explain to me what specific kind of code editing is so much more productive in vim than in any other editor. Can someone give me a real-world, commonly occurring example?

Or is it not about productivity? Is it an interface thing? People like the way it feels to use? The article explains the "feeling" I always hear about, how vim is so much better, but for the millionth time, fails to tell me why, in a way a non-vim-user can understand.

jdludlow 2 days ago 2 replies      
My first week in college, a distant relative of mine at the same school pulled me into the Sun lab. The sum total of his instructions was:

* This is how you log in.

* This is ls.

* This is cd.

* This is man.

* This is apropos.

* Learn vi.

* Have fun.

What came from that 5 minute intro to Unix has been applied orders of magnitude more often than anything from 5 years of an EE degree.

SwellJoe 2 days ago 4 replies      
I've said it before, and I'll say it again...

Editing text is a solved problem: vim or emacs. Pick one and get back to work.

karmajunkie 2 days ago 3 replies      
Can we all stop patting ourselves on the back for being vim badasses now?

Yes, it's a fantastic editor.
Yes, it's probably going to make you more productive.
Yes, it's every bit as capable as any other IDE or editor out there.

No, it will not make you a better person.
No, mastery of vim does not make you fart sunshine.
No, it won't make you an enlightened buddha.

moron 2 days ago 0 replies      
I still can't believe people love that damn thing so much, even after all these years. There are way better tools in my opinion.

Of course, if you like it by all means continue using it. But if you think it has anything to do with your talent as an engineer, you are wrong. And if you see other engineers not using it and think it implies anything about their talent, you are wrong.

Side note: if anyone wrote in such a breathless tone about Apple, they would be skewered as a fanboy and a drinker of the Kool-Aid.

AndyKelley 2 days ago 0 replies      
Somehow this reminds me of an Everything2 article about Nethack titled "You have a sad feeling for a moment, and then it passes."

It inspires the same kind of nostalgia.


kstenerud 2 days ago 0 replies      
I started using vi in high school. Used it extensively throughout college. Then I had my first taste of a proper GUI IDE.

Since then, I've only used vim for editing config files on a remote unix system.

kator 2 days ago 0 replies      
LOL, so in my early days it was "vi", but I use "vim" nowadays.

Either way the funny thing is when someone asks me "how do you do THAT in vi?" I often find myself typing in the air to replay the muscle memory that executes those instructions.

For better or for worse I've become muscle programmed to execute vi commands in flashes of thought that really never reach my normal cognitive functions. I love to watch people stunned as they see me do something I've taken for granted for years or figured everyone else knew how to do.

All that said, all tools are not for all situations and all people. I do at times find myself jumping into eclipse (Java I'm looking at you!!!) because of the many additional features available in a good IDE.

I know you can run the vim extensions in eclipse but I've never bothered. I figure if I'm going to mouse my way through an IDE I'm not sure what having vi commands will do to make my life easier. :-)

lisper 2 days ago 0 replies      
You could do a global search and replace of "vim" with "emacs" (or pretty much any other development tool for that matter) and it would not change the semantic content of this article at all.
Xcelerate 2 days ago 1 reply      
You know, I'm doing an experiment like this with Tetris. I was looking at the world record for 40 line clear (20.55 seconds at http://www.youtube.com/watch?v=WzUUKKuye24) and I realized that the limit is not how fast he thinks, but how fast he is pressing the keys.

So I designed my own system. One key press drops the piece instantly into the correct spot, with the correct rotation. It's making my brain hurt practicing because it requires so much thinking, but once I get it into muscle memory, things are going to get really interesting. My 40 line best using the arrow keys is 45 seconds and with my new scheme it is 2:21. But I'm dropping time quickly and I'm sure very soon I'll beat my record.

twism 2 days ago 1 reply      
Looks like he forgot to switch to command mode at the end of the post.
jamesaguilar 2 days ago 1 reply      
I've never found that symbols -> buffer is the bottleneck for me. Instead, it exists somewhere between problem -> symbols.
kabdib 1 day ago 0 replies      
I'm like that, exactly, but with Emacs.

A long time ago I narrowly escaped being like that exactly, but with TECO :-)

I've learned not to fret about people's choice of editor. For instance, if someone is happy in Notepad, as long as it doesn't affect me, I've learned not to care. (I might grit my teeth in frustration, watching them flail for minutes at something I could do in seconds with my choice of editor, but that's my problem.)

habitmelon 2 days ago 1 reply      
I have never been moved so close to tears by an article about a text editor before.
VMG 2 days ago 2 replies      
Nope. Don't get me wrong, I love vim, but I don't need to have it everywhere. It makes no sense in the browser. It's lacking some features Eclipse has (although if Eclipse had a proper Vim mode, I'd use it in a heartbeat).
gatordan 2 days ago 1 reply      
Not so subtly, the author is telling the reader that if you program with Vim you'll be just like "all the really great programmers" who can write 4-line solutions to problems that take ordinary programmers 10 pages. You'll produce "impossible patterns of code and text manipulation" and be able to "fill dozens of registers" (admittedly, I don't know what that means). This is all a bit silly and reminds me of "hacking" in Hollywood films.

I understand that this post was tongue in cheek. I also understand that not all text editors are equal, if you see someone writing their first hello world program in Notepad yes you should tell them about much better alternatives. But this post confuses ability to program with ability to use Vim.

Being really good at using Vim will definitely let you say you're part of the club of people who are really good at using Vim. It just might make you more efficient at manipulating code. But let's not perpetuate the elitism that is pervasive enough in hacking, it's a text editor. It doesn't make you more intelligent, better at problem solving, or writing more efficient code.

lukejduncan 2 days ago 3 replies      
For me pragmatism wins the day. Vim has been installed on every machine I've ever ssh'ed into. Not true for emacs or even nano or pico.
kennethologist 2 days ago 7 replies      
This article has changed (or at least I hope) my life. I'm installing Vim today on my Windows laptop. Does anyone know of a good guide for beginners?
mdonahoe 1 day ago 0 replies      
I chose vim because it was radically different than anything else.

All text editors have crazy shortcuts that do magic, but only vim has the ironically named NORMAL mode with its verbs, movements and registers...

I treated it like a new game to learn. Sometimes I will write a macro because it is more fun, even though it might not save me raw time.

ciderpunx 2 days ago 0 replies      
Nice. Very nice.

I even tweeted it. With TwitVim, naturally. http://vim.sourceforge.net/scripts/script.php?script_id=2204

devin 1 day ago 1 reply      
"Knife skills" don't make you a good developer. Cook a great steak and worry less about cutting it. Tools are boring. People are far more important. A post like this is akin to "I LOVE BRITNEY SPEARS." Please God, make the programmer pop culture stop.
jordo37 1 day ago 0 replies      
I fully recognize I am outing myself here, but the timing seems great given the Vim articles that have been coming up lately, and my lead engineer keeps bugging me to really get into Vim. I have used it before, but never GOT it. Is there an online guide that is great (and not just a story like this one), or do I need to get the O'Reilly Vi and Vim book?
pacomerh 2 days ago 0 replies      
Nice writing, this goes straight to the HN monthly. Ultra nerdy nonetheless.
modarts 2 days ago 0 replies      
I admit that this is beautifully written, even though I strongly disagree with the premise.
chrisdotcode 2 days ago 1 reply      
Also feels good to have vim bindings inside of emacs using evil (http://www.emacswiki.org/emacs/Evil).
mikebell 2 days ago 0 replies      
I'm all for vim or whatever other editor you prefer.

But to somehow claim that one editor or another is going to make you a better programmer and make your 10 pages of crap code become 4 elegant lines is pretty nonsensical.

No matter what editor you use; if you're not good at problem solving, you're not going to make a good programmer.

wangweij 1 day ago 0 replies      
If someone can always churn out "4 line solutions for an assignment that took you 10 pages of code to complete", he can use any editor, or maybe simply "cat >".
farinasa 2 days ago 0 replies      
Being somewhere in the middle of that timeline, I'd watch that movie. This piece reminds me that an author's vision is always more important to communication than the subject itself.
helpbygrace 2 days ago 0 replies      
I'm definitely saving this article, not for the awesome main post but for the useful comments and replies.

Thanks all.

thornofmight 2 days ago 2 replies      
How do people use Vi on large projects? Do you just manually type in the name of the file you're looking for every time?
jonzjia 2 days ago 0 replies      
macvim for objective-c? That sounds like a PITA to set up properly.
rsbrown 2 days ago 0 replies      
No, I didn't.
andere 2 days ago 0 replies      
What a feel-good piece.
m12r 2 days ago 0 replies      
Why "vim creep"? It should be titled "Ode to Vim".
What are the most intellectually stimulating websites you know of? reddit.com
403 points by ColinWright  1 day ago   209 comments top 68
hluska 1 day ago  replies      
At the risk of being called obvious, I'm going to say, "Hacker News." I've never found a site that lets me tap into the brains of so many extremely smart people. This site has made me a much smarter person, dramatically improved my writing, and made me far better at what I do.


glhaynes 1 day ago 1 reply      
MetaFilter! http://metafilter.com/. Community blog with high standards ($5 registration fee helps this, I suspect) and lots of challenging, intelligent posts and commenters.

And its related site for questions, Ask MetaFilter: http://ask.metafilter.com/. From browsing it regularly I know [good, well-supported] answers to a thousand interesting and useful questions I'd never have even thought to ask.

kmfrk 1 day ago 7 replies      
If you want a secret tip for exploring a new world, download Papers (http://www.mekentosj.com/papers/), put all the interesting publications you find in one Dropbox folder, and import them. The annotation and note features - as well as the back-up option - make this a really enjoyable way to read publications.

It really is something to read papers that define the way we think, and it's a nice alternative to short blog posts and (pop) articles versus (pop) books.

Here are some papers to get you started:

* 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=0998565

* Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization: http://papers.ssrn.com/sol3/Papers.cfm?abstract_id=1450006

* A technique for isolating differences between files: http://ejohn.org/projects/javascript-diff-algorithm/

* A Future-Adaptable Password Scheme: http://static.usenix.org/events/usenix99/provos.html

Maybe this is cheating, maybe it isn't, but I definitely recommend it.


If you want to get political, some left-of-centre-leaning writers you can't go wrong with:

1. Matt Taibbi on Wall Street, Rolling Stone (http://www.rollingstone.com/politics/blogs/taibblog)

2. Frank Rich in American politics, now at New York Magazine (https://twitter.com/frankrichny)

3. Glenn Greenwald on civil liberties and foreign policy, in moderation (http://www.salon.com/writer/glenn_greenwald/)

4. Juan Cole on the Middle East, in moderation (http://www.juancole.com/)

5. Lawyers, Guns, and Money (http://www.lawyersgunsmoneyblog.com/)

6. The New Yorker's long-form articles. (I find their blog posts to be really poor, by any standards.) Also, ditch the Malcolm Gladwell articles.

MikeCapone 1 day ago 1 reply      
My submission would be for:


A lot of the main sequences of material (http://wiki.lesswrong.com/wiki/Sequences) there were written by a Hacker News member, Eliezer Yudkowsky. But it's also a great living community (like here), with new stuff constantly being added.

impendia 1 day ago  replies      
How about the Economist (http://www.economist.com)? By far the most intellectual and globally-oriented news website I'm aware of.
nick_urban 1 day ago 1 reply      
Arts and Letters Daily collects links to a variety of interesting articles, mostly from the humanities.


SatvikBeri 1 day ago 1 reply      
None. I love some websites like Hacker News, Quora, etc. for the knowledge content they give me, but none of them are nearly as intellectually stimulating as a book. There's something about focusing on a topic for hours at a time that leads me to understand it much better than if I spend the same amount of time in smaller chunks.
danboarder 1 day ago 1 reply      
http://marginalrevolution.com and http://kottke.org are both longtime favorites of mine, featuring interesting content ranging from economics to food to philosophy to art...
zerostar07 1 day ago 3 replies      
How did http://Edge.org miss the list?

Is it just me or is that list a bit too shallow?

kristofferR 1 day ago 1 reply      
TheBrowser (http://thebrowser.com/) collects the best long form articles from around the web every day.

It's definitely worth checking out, it's one of my favorite web sites.

omaranto 2 hours ago 0 replies      
I mentioned MathOverflow [1] on that post and only one person upvoted it (I wouldn't be too surprised if it were someone I know or at least have heard of, now that I think about it).

[1] http://mathoverflow.net

aw3c2 1 day ago 2 replies      
http://projecteuler.net/ made me try so many different programming languages just for fun.
Samuel_Michon 3 hours ago 0 replies      
The Middle East Media Research Institute translates Arab and Persian news into English. It's fascinating how much it differs from news in the Western world, in style as well as content.


harrylove 1 day ago 1 reply      
msluyter 1 day ago 0 replies      
If you want a challenging economics blog, I'd vote for Marginal Revolution. Tyler Cowen writes on a wide range of interesting economic topics from a unique (libertarian oriented, but minimally ideological and empirically grounded) perspective.
joeld42 1 day ago 1 reply      
http://www.butdoesitfloat.com/ is my favorite site when im stuck for inspiration.
ovi256 1 day ago 0 replies      
Who else had an attack of elitist "Eternal September" panic and checked if HN was on that list ? I admit I did.
exue 1 day ago 0 replies      
Quora, the question-and-answer site! Here's a great example:

Question: Engineering Management: Why are software development task estimations regularly off by a factor of 2-3?


Exploring, reading, writing, upvoting, and commenting on great questions and answers from all sorts of subjects has helped me learn a lot. The people there are great too, with many lending expert knowledge you wouldn't see elsewhere.

patrickg 1 day ago 0 replies      
Nobody mentions Wikipedia? I think this is one of the best sites for understanding historical things (amongst others, of course).
codesuela 1 day ago 1 reply      
I'm shocked that no one has mentioned http://youarenotsosmart.com/

> The central theme of You Are Not So Smart is that you are unaware of how unaware you are. There is a branch of psychology and an old-but-growing body of research with findings that suggest you have little idea why you act or think the way you do. Despite this, you continue to create narratives to explain your own feelings, thoughts, and behaviors, and these narratives, no matter how inaccurate, become the story of your life.

corford 1 day ago 0 replies      
One that seems to be missing so far: www.lettersofnote.com. Sometimes, it can be profoundly stimulating.
jeremyt 1 day ago 0 replies      
The posts are excruciatingly long, but fascinating and thought-provoking whether you agree or not:


sgrytoyr 1 day ago 0 replies      
I expected http://www.3quarksdaily.com/ to be somewhere near the top of that list, but it's nowhere to be found, which really surprised me. It's one of the most consistently interesting sites on the Internet, if you ask me.
edwardy20 1 day ago 0 replies      
Quora (http://quora.com) helps me learn new things every day in all subjects.
TeMPOraL 1 day ago 1 reply      
http://ted.com - the number and quality of ideas there still amazes me. Pretty much every talk can be the start of an interesting, constructive and deep conversation. Almost every video leaves me with a feeling of wonder. Culture at its highest.

I'm pretty sure everyone on HN knows what TED is, but just in case someone doesn't, here are some of my favourite talks - they show the breadth of the topics covered:

- http://www.ted.com/talks/derek_sivers_weird_or_just_differen... - 2 minutes about how things we think work one way may be completely different somewhere else

- http://www.ted.com/talks/hans_rosling_and_the_magic_washing_... - how technology really improves lives

- http://www.ted.com/talks/lang/en/eythor_bender_demos_human_e... - exoskeletons, with wheelchair woman standing and walking live on the scene

- http://www.ted.com/talks/lang/en/sam_richards_a_radical_expe... - a talk about empathy

And of course, the obligatory one,

- http://www.ted.com/talks/lang/en/ken_robinson_says_schools_k... - how school kills creativity

b_emery 1 day ago 1 reply      
Here's a personal favorite: http://calnewport.com/blog/, which has been on HN before. The post interval (~2wks to a month) is perfect for keeping me thinking about deliberate practice, learning methods, and being 'too good to ignore'. It's fundamentally changed my outlook on what it means to be smart, and it has in fact encouraged me to pursue a PhD. So, yeah, it's pretty stimulating. Here's a favorite post:


intellegacy 1 day ago 3 replies      
This is slightly off-topic but I thought I'd ask HN's opinion on an intellectual website idea I have.

I want to call it "intellegacy", for "intellectual legacy"

I really enjoy reading insightful essays and comments on the internet, and thought it'd be cool to have a website with all kinds of intellectuals with their own pages on it where I could read their essays, see a list of their books, and have conversations with them.

I envision scientists, artists, politicians all interacting on the site. For example, Neil de Grasse Tyson could post his blogs or essays there, and fans of his could get a summary of all his work, books, and what he's currently working on or reading.

Besides the goal of intellectuals having their own space to publish their insights, I also want the public to be able to read and learn on a clearly organized website, by taking their time. Maybe this isn't good for pageviews, but on other websites the content refreshes so quickly that a lot of insights are lost in the shuffle.

The grand vision is to be a huge library of insights, clearly organized and that can be read by anyone who wishes to learn and follow the thought leaders in our world.

We could "best-of" the best debates and discussions and future generations could read everything.

vijayr 1 day ago 0 replies      
someone collected most links mentioned in the comments
ThomPete 19 hours ago 0 replies      
All these are great, but nothing beats http://www.spacecollective.org

Just to give you an example is this book recommendation post


Warning: There is some weird flash thing in the very beginning but just click to continue.

BlackNapoleon 7 hours ago 0 replies      
The Last Psychiatrist: http://thelastpsychiatrist.com/
ajays 1 day ago 2 replies      
Here are some others that are interesting:

Vitamin Cr : http://vitamincr.com

Brain pickings: http://www.brainpickings.org/

Synaptic Stimuli: http://synapticstimuli.com/

Timoni's blog: http://blog.timoni.org/

nacker 16 hours ago 0 replies      
Intellectual stimulation is not always a pleasant experience, which tends to make people susceptible to restricting their reading to material with which they broadly agree.

For most HN readers, I'd suggest visiting a site like http://www.vdare.com

markkat 1 day ago 0 replies      
http://hubski.com/ You choose who curates content for you. Posts are shared rather than voted on.
primodemus 1 day ago 0 replies      
Both are full of interesting ideas:



justinhj 1 day ago 0 replies      
I agree that hn fits the bill well. It is quite limited in the diversity of topics and type of people, by design.

I've found I also get good mental stimulation by picking free online courses at random. For example, Yale University has a brilliant set of lectures on Milton, about which I knew very little beforehand.

TylerE 1 day ago 0 replies      
Maybe not quite as high-brow as much of what I've seen listed, but:

Sporcle: http://www.sporcle.com

Scene_Cast2 1 day ago 0 replies      
My suggestion isn't quite going to be a link, but here it goes. Get a hobby. Programming, sport, (specific) art, etc. There is a myriad of niche communities for all sorts of things, and I would argue that the more in-depth creative ones are more stimulating than any of the "general-purpose" ones listed here.

My "intellectual stimulation" consists of coming up with or doing things. Sure, self-reflection helps, but more often than not, they're inspired by something I see in a "specialized" community.

Slightly off-topic, but asking "why" helps with self-reflection and creativity.

HarshaThota 1 day ago 2 replies      
Not a website per se, but AskScience (http://www.reddit.com/r/askscience) is an amazing place for various science discussions.
halvsjur 1 day ago 0 replies      
The fifth site on the list (https://www.ifeveryoneknew.com/) of the guy that started that thread is pretty interesting.

The five points listed there are well known and well documented. Those are not crazy unfounded theories.

Yet most people I meet do not believe any of them. Why is that? Cognitive dissonance?

mahmud 1 day ago 0 replies      
my favorite news aggregator: scholar.google.com
metaphorical 1 day ago 0 replies      
http://butdoesitfloat.com for visual and conceptual thinking.
scottjad 1 day ago 0 replies      
http://mises.org (economics and politics) and http://volokh.com (law)
kul 17 hours ago 0 replies      
glaze 16 hours ago 0 replies      
lignuist 17 hours ago 0 replies      
SolarUpNote 1 day ago 0 replies      
The Long Now Foundation podcast


Maven911 11 hours ago 0 replies      
Slate.com and famous economists' blogs
Spittie 1 day ago 0 replies      
I really like BetterExplained (http://betterexplained.com/).
It made me a bit more interested in math, which I never liked.

I also love the whole StackExchange network, especially Skeptics (http://skeptics.stackexchange.com). A great way to learn new stuff by examining common myths.

taofu 12 hours ago 0 replies      
Hacker News. Simply because it just gifted me this thread, full of intellectual goodness.
stevencorona 1 day ago 1 reply      
http://lumosity.com for keeping your mind sharp (costs money, free to test for 3 days)
platz 1 day ago 0 replies      
Casting my vote for http://www.ribbonfarm.com
known 17 hours ago 0 replies      
jameszol 1 day ago 0 replies      
I recommend this as a starting point for finding intellectually stimulating websites: http://www.reddit.com/r/reddit.com/comments/cktxy/reddit_let...
skbohra123 1 day ago 0 replies      
Thank you, this has to be one of the best threads ever on HN. Good to find interesting stuff beyond HN.
dmoo 7 hours ago 0 replies      
www.Thersa.org - I particularly like the Animate items
nova 1 day ago 0 replies      
http://www.lesswrong.com (it was better a few years ago)
ececconi 1 day ago 0 replies      
How is Quora not on there?
ybother 1 day ago 0 replies      
reddit is still a great site once you remove the default subreddits and find some of the more active but smaller subreddits. /r/truereddit regularly has in-depth articles that are still accessible to a general audience.

Science subreddits with a cognitive barrier to entry like /r/neuro are a great source of news specific to their scientific communities. Geographic subreddits such as /r/[yourmetro] are also a great way to keep in touch with the general vibe and events of your city.

comatose_kid 1 day ago 0 replies      
mirceagoia 1 day ago 0 replies      
readymade 1 day ago 0 replies      
Hacker News! (just kidding)
Buzaga 1 day ago 2 replies      
www.cracked.com is one for me, seriously
Scaling lessons learned at Dropbox, part 1 eranki.tumblr.com
392 points by eranki  2 days ago   79 comments top 22
dools 1 day ago 4 replies      
but I really hate ORM's and this was just a giant nuisance to deal with

I like object relational mapping as a theory (i.e. I have an object of type Author which has 1 or more books I can loop over), but I hate ActiveRecord implementations. Eventually, they just end up implementing almost all of SQL, but in some arcane bullshit syntax or sequence of method calls that you have to spend a bunch of time learning.

I also seriously doubt that anyone has ever written a production system of any reasonable complexity and been able to use the exact same ORM code with absolutely any backend (if you have an example please correct me on this). This barely even works with something like PDO in PHP which is a bare bones abstraction across multiple SQL backends.

When it comes down to it, the benefits of ActiveRecord are all but dead on about the third day of development. The data mapper pattern adopted by SQLAlchemy (et. al.) takes all of the shitness of ActiveRecord and adds mind bending complexity to it.

SQL is easy to learn and very expressive. Why try and abstract it?

I spent years working with an ActiveRecord ORM I wrote myself in my feckless youth and thought that it was the answer to the world's problems. I didn't really understand why it was so terrible until I did a large project in Django and had to use someone else's ORM.

When I really analysed it, there were only three things that I really wanted out of an ORM:

1) Make the task of writing complex join statements a bit less tedious

2) Make the task of writing a sub-set of very basic where clauses slightly less tedious

3) Obviate the need for me to detect primary key changes when iterating over a joined result set to detect changes in an object (for example, looping over a list of Authors and their Books)

To that end, I wrote this:


It's written in PHP because I like and use PHP but it's a very simple pattern that I would like to see elaborated upon/taken to other languages as I think it provides just the bare minimum amount of functionality to give some real productivity gains without creating a steep learning curve, performance trade-off or any barrier to just writing out SQL statements if that's the fastest way to solve the problem at hand.

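The third item on that list (collapsing a joined result set without hand-detecting primary-key changes) is small enough to sketch. Here is a stdlib-only Python version rather than his PHP, with hypothetical rows standing in for a real Authors-join-Books query; note that itertools.groupby only merges adjacent rows, so it relies on the query ordering by the key:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical flattened rows, as returned by something like:
#   SELECT a.id, a.name, b.title FROM author a
#   JOIN book b ON b.author_id = a.id ORDER BY a.id
rows = [
    {"author_id": 1, "author": "Le Guin", "title": "The Dispossessed"},
    {"author_id": 1, "author": "Le Guin", "title": "The Left Hand of Darkness"},
    {"author_id": 2, "author": "Banks", "title": "The Player of Games"},
]

def nest(rows):
    """Collapse a joined result set into one object per primary key."""
    authors = []
    # groupby only merges adjacent rows, hence the ORDER BY in the query
    for (aid, name), group in groupby(rows, key=itemgetter("author_id", "author")):
        authors.append({"id": aid, "name": name,
                        "books": [r["title"] for r in group]})
    return authors

for author in nest(rows):
    print(author["name"], author["books"])
```

The same thin helper ports directly to PHP, and it covers most of what people actually want from an ORM's object graph without the arcane query syntax.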
brc 2 days ago 5 replies      
The idea of running extra load - it sounds good in theory but I can't help thinking that it's a bit like setting your watch forwards to try and stop being late for things. Eventually you know your watch is 5 minutes fast so start compensating for it. I wonder if this strategy starts to have the same effect - putting fixes off because you know you can pull the extra load before it becomes critical. In the same way you leave for the train a couple of minutes later because you know your watch is actually running fast.
nl 2 days ago 2 replies      
I wish he'd left the security advice out.

The whole post was excellent, but all the useful points will now be overshadowed by the armchair quarterbacking about security by people who mostly don't understand that ALL security is a compromise, and it is as important to understand and make deliberate decisions about your security as it is to try to make a secure system in the first place.

jgannonjr 2 days ago 2 replies      
Great post, but this part scares me a bit...

I think a lot of services (even banks!) have serious security problems and seem to be able to weather a small PR storm. So figure it out if it really is important to you (are you worth hacking? do you actually care if you're hacked? is it worth the engineering or product cost?) before you go and lock down everything.

Just because you can "afford" to be hacked, doesn't mean you shouldn't take all the steps necessary to proactively protect your data. In the end, security is not about you, it is about your users. This is exactly the type of attitude that leads to all the massive breaches we have been seeing recently. Sure your company is "hurt" with bad PR, but really your users are the ones who are the real victims. You should consider their risk (especially with something as sensitive as people's files!) before you consider your own company's well being.

Edit: formatting

akent 2 days ago 4 replies      
I noticed that a particular “FUUUCCKKKKKasdjkfnff” wasn't getting printed where it should have

Why not take the extra half a second to make those random strings meaningful and hidden behind a DEBUG log level?

acslater00 2 days ago 2 replies      
For the record, I use sqlalchemy 0.6.6 regularly under fairly heavy load, and have never had a problem with it. Any 'sqlalchemy bugs' are inevitably coding mistakes on my part.
prayag 2 days ago 2 replies      
Fabulous post. Thanks for writing.

One point it misses though is to test your backup strategy often. When you scale fast things break very often and it's good to be in practice of restoring from backups every now and then.

crazygringo 1 day ago 0 replies      
> I noticed that a particular “FUUUCCKKKKKasdjkfnff” wasn't getting printed where it should have


I've never seen a shorter description of real-world software development. That's it in a nutshell!

ivankirigin 2 days ago 1 reply      
Rajiv is awesome, you should listen to him
misiti3780 2 days ago 0 replies      
Great advice:

"pick lightweight things that are known to work and see a lot of use outside your company, or else be prepared to become the “primary contributor” to the project."

wulczer 1 day ago 1 reply      
Great article! Small nitpick from someone who just tried this on his server logs :)

  * on my machine xargs -I implies -L1, so you can drop that
  * use gnuplot -p or the graphic will disappear immediately after rendering

elefont2 1 day ago 0 replies      
'Even memcached, which is the conceptually simplest of these technologies and used by so many other companies, had some REALLY nasty memory corruption bugs we had to deal with, so I shudder to think about using stuff that's newer and more complicated'

Does anyone know what memory corruption bugs they are referring to?

lobster_johnson 1 day ago 0 replies      
I'm surprised that Dropbox actually uses S3 internally to store data. All along I had assumed, wrongly, that Dropbox had built their own distributed storage cluster.
opminion 2 days ago 0 replies      
A topic usually left out of scaling discussions is: how much can one predict? Or is it mostly trial and error? Is it mostly about good "reactive" engineering, or would it have benefited from good mathematical modeling?
JohnGB 1 day ago 1 reply      
I believe that the section on "The security-convenience tradeoff" is fundamentally flawed.

A username and password represent a pair. Neither one has meaning in terms of authentication without the other.

Take the example where I have forgotten my username (JohnGB), but try with what I think it is (Say JohnB), and enter the correct password for my actual username. The system would then tell me that my username is fine, but that my password isn't. From then on, I would be trying to reset the password for a different user as the system has already told me that my username was correct.

Please, for the sake of sane UX, don't do this!

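The UX JohnGB is arguing for boils down to one rule: validate the username/password pair together and return a single generic failure message. A minimal Python sketch; the in-memory user store and the plain SHA-256 hash are illustrative assumptions (a real system would use a slow password hash such as bcrypt or scrypt):

```python
import hashlib
import hmac
import os

def _hash(salt: bytes, password: str) -> str:
    # Illustration only: a real system would use a slow hash (bcrypt, scrypt).
    return hashlib.sha256(salt + password.encode()).hexdigest()

# Hypothetical in-memory user store: username -> (salt, hash).
_salt = os.urandom(16)
USERS = {"JohnGB": (_salt, _hash(_salt, "correct horse"))}

def login(username: str, password: str) -> str:
    # Look up a dummy record on unknown usernames so both halves
    # are always checked as a pair.
    salt, stored = USERS.get(username, (b"", ""))
    ok = hmac.compare_digest(_hash(salt, password), stored)
    if username in USERS and ok:
        return "welcome"
    # One generic message: never reveal whether the username exists
    # or only the password was wrong.
    return "invalid username or password"

print(login("JohnB", "correct horse"))  # wrong username, someone's password
print(login("JohnGB", "wrong"))         # right username, wrong password
```

Both failing calls produce the same message, so the JohnB-vs-JohnGB probe in the comment above learns nothing.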
gallerytungsten 2 days ago 0 replies      
Great article. Rajiv made it easy to understand the conceptual framework. The lesson is: always strive to be robust. Test your failure points deliberately. Applicable to more than just server scaling.
anamax 1 day ago 0 replies      
There's a talk about Dropbox scaling at http://www.stanford.edu/class/ee380/winter-schedule-20112012... .
mistercow 1 day ago 0 replies      
Running with extra load seems inefficient in terms of energy consumption. Would it be possible to achieve the same thing by inserting delays or something that can be turned off?
kevinburke 1 day ago 2 replies      

    MySQL has a huge network of support and we were
    pretty sure if we had a problem, Google, Yahoo,
    or Facebook would have to deal with it and patch
    it before we did. :)

I am fairly certain Google is running its own (patched) version that's fairly different than the off-the-shelf MySQL.

philfreo 2 days ago 2 replies      
Can you explain the nginx/HAproxy config a little more?
matt 2 days ago 0 replies      
Nice, love the idea of running with extra load to predict breaking points.
stratos2 2 days ago 0 replies      
All security is a balancing act, which is the point he is making. There is always a tradeoff.
UK anti-encryption law falkvinge.net
342 points by timf  2 days ago   188 comments top 31
nathan_long 2 days ago  replies      
His argument is: 1) They can lock you up for refusing to decrypt something. 2) Encrypted data looks exactly like random noise. 3) Encrypted data can be hidden in any file. 4) Therefore, they can allege that nearly anything is encrypted and lock you up on that basis.

I'd say that's terrifying.

Another thought: doesn't this make it possible to frame someone by writing random data to their hard drive?

16s 2 days ago 2 replies      
It is impossible to prove a PRNG'ed file is or is not encrypted data. TrueCrypt volumes look identical to `dd if=/dev/urandom of=file.bin bs=512`. Create a few of each and then evaluate them using ent to see this for yourself.

Edit: Link to ent http://www.fourmilab.ch/random/

You could prove the file is encrypted if it is indeed encrypted and you have the passphrase and the program to decrypt it, but outside of that, it's simply not possible to say with any level of confidence that the bits are really encrypted.

BTW, I wrote TCHunt in 2007, a program that attempts to seek out encrypted TrueCrypt volumes and I have a FAQ that covers much of this. Here's the link for anyone interested in reading more about it: http://16s.us/TCHunt/

And, there is usually much more to it than randomish bits in a file on a disk. The government agents usually have other evidence that suggests the person in question is doing illegal things and may have cause to use encryption. Finding actual encrypted data is normally just icing on the cake to them.

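The "randomish bits" claim is easy to check numerically with a byte-level Shannon entropy estimate, which is roughly what ent reports. A stdlib-only Python sketch; the SHA-256 counter-mode keystream is a toy stand-in for a real cipher (nothing like TrueCrypt's actual format), used only to avoid external dependencies:

```python
import hashlib
import math
import os

def keystream_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy stream cipher: XOR the plaintext with a SHA-256-in-counter-mode
    keystream. A stand-in for a real cipher, just to get cipher-like output."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(p ^ k for p, k in zip(plaintext, stream))

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 = looks like uniform noise)."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

plaintext = b"attack at dawn " * 4096          # highly non-random input
ciphertext = keystream_encrypt(b"secret key", plaintext)
noise = os.urandom(len(plaintext))

print(f"plaintext  {byte_entropy(plaintext):.2f} bits/byte")
print(f"ciphertext {byte_entropy(ciphertext):.2f} bits/byte")
print(f"noise      {byte_entropy(noise):.2f} bits/byte")
```

The ciphertext and the urandom sample both land at about 8.0 bits per byte while the plaintext sits far below; nothing in the statistics separates the last two, which is the whole problem for the law.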
mootothemax 2 days ago 1 reply      
In the section of the act mentioned (Regulation of Investigatory Powers Act 2000, part III), two of the defined terms are:

“key”, in relation to any electronic data, means any key, code, password, algorithm or other data the use of which (with or without other keys):

(a) allows access to the electronic data, or

(b) facilitates the putting of the data into an intelligible form;

-- and --

“protected information” means any electronic data which, without the key to the data:

(a) cannot, or cannot readily, be accessed, or

(b) cannot, or cannot readily, be put into an intelligible form;


At first, I thought the argument in this article was nonsense. However, whilst I'd hope common sense would prevail, the definitions above seem broad enough that a policeman could make one's life difficult for a while.

shill 2 days ago 0 replies      
Every digital storage device on earth should contain a randomly sized random data file called RANDOM-DATA. The user of said device could optionally replace this file with encrypted data. Once critical mass is achieved, states that do not respect individual liberty would have no way of determining the nature of every RANDOM-DATA file that they obtain by eavesdropping, theft or force.

I know the answer to this is 'easier said than done'. Certainly hardware and OS vendors can't be trusted with this task. Maybe FOSS installers could educate users and optionally create the file? How can we make this happen? I want to wear a t-shirt that says 'random numbers save lives.'

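The proposal above is only a few lines of code. A hedged sketch; the file name, the size bounds, and the chunked writing are all assumptions for illustration:

```python
import os
import secrets

def write_random_data(directory: str,
                      min_size: int = 1 << 20,
                      max_size: int = 64 << 20) -> str:
    """Drop a RANDOM-DATA file of random size into `directory`.
    The file is pure CSPRNG output, so it is statistically indistinguishable
    from an encrypted container that may later replace it."""
    size = min_size + secrets.randbelow(max_size - min_size + 1)
    path = os.path.join(directory, "RANDOM-DATA")
    with open(path, "wb") as f:
        remaining = size
        while remaining:                      # write in 1 MiB chunks
            chunk = min(remaining, 1 << 20)
            f.write(secrets.token_bytes(chunk))
            remaining -= chunk
    return path
```

Randomizing the size matters too: a fixed-length file would itself become a fingerprint.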
SEMW 2 days ago 2 replies      
While it is obviously a bad law, it's not quite as bad as he's making out.


"For the purposes of this section a person shall be taken to have shown that he was not in possession of a key to protected information at a particular time if:

(a) sufficient evidence of that fact is adduced to raise an issue with respect to it; and

(b) the contrary is not proved beyond a reasonable doubt."

In other words, if there's evidence for there to be 'an issue' about whether you actually do have a key (or whether e.g. it's just random noise), it's up to the prosecution to prove beyond reasonable doubt that it is actually data, and you do have the key.

So the flowchart is:

- If the police can prove they have reasonable grounds to believe that something is encrypted data that you have the key to, then

- That raises an evidential presumption that you do have it, which you can rebut by

- adducing evidence that just has to raise an issue about whether you have a key (inc. whether it's encrypted data at all), in which case the police have to

- Prove beyond reasonable doubt that it is encrypted, and you do have the key.


jakeonthemove 2 days ago 3 replies      
Damn, the UK is pretty f'ed up - the list of things that British citizens can't enjoy compared to a lot of other countries (even developing ones) is growing every day.

Meanwhile, a criminal could easily just store everything on an encrypted microSD card, then eat it if anything goes wrong - the oldest trick in the book still works in the digital age :-D...

shocks 2 days ago 1 reply      
Hidden volumes.

Volume one contains hardcore porn, volume two contains bank job plans. Neither can be proved to exist with their keys.

When asked, hand over the porn keys. Plausible deniability.

freehunter 2 days ago 8 replies      
I have to wonder if this would ever hold up in court. I don't know much about the UK justice system, but in America it would be pretty rare to be convicted of a crime that they can't actually prove you committed. You could be jailed for refusing to comply with a court order to decrypt the file, but if you can prove it's not actually encrypted, they can't do anything about it.
yason 1 day ago 0 replies      
The difference between programmers/scientists/hackers and politicians/authorities/lawyers is that the former see instantly where seemingly small changes in laws and policies will ultimately lead, whereas the latter will dismiss these potential problems by making remarks such as "It will only be used against bad guys", which translates to "We had a few hairy cases where this sort of law would have really helped, so we wrote one to cover similar circumstances in the future, and while we don't really know what else goes out with the bathwater, we will need something at our disposal."
mistercow 2 days ago 1 reply      
>Yes, this is where the hairs rise on our arms: if you have a recorded file with radio noise from the local telescope that you use for generation of random numbers, and the police asks you to produce the decryption key to show them the three documents inside the encrypted container that your radio noise looks like, you will be sent to jail for up to five years for your inability to produce the imagined documents.

Of course, if you have access to the files, you could just XOR the noise with some innocuous documents, and send the result to the police saying it's a one-time pad.
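mistercow's trick works because a one-time pad is just XOR: plaintext = ciphertext XOR key, so for any noise and any chosen cover text there exists a "key" that decrypts the noise into the cover text. A minimal sketch (the function name and sample texts are ours):

```python
import os

def fake_otp_key(noise: bytes, cover_text: bytes) -> bytes:
    """Fabricate a key that 'decrypts' random noise into a chosen plaintext.

    For a one-time pad, plaintext = ciphertext XOR key, so the key that
    yields any desired plaintext is simply noise XOR plaintext.
    """
    if len(cover_text) > len(noise):
        raise ValueError("cover text must not be longer than the noise")
    return bytes(n ^ p for n, p in zip(noise, cover_text))

noise = os.urandom(64)  # stand-in for the recorded telescope noise
cover = b"Grocery list: eggs, flour, milk, two onions and a lemon."
key = fake_otp_key(noise, cover)

# Handing over `key` makes the noise "decrypt" to the innocuous text
assert bytes(n ^ k for n, k in zip(noise, key)) == cover
```

This is exactly why the law can't distinguish random data from ciphertext: the same noise "contains" every document of the same length, depending on which key you fabricate.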

theaeolist 2 days ago 2 replies      
Isn't TrueCrypt's 'hidden volume' feature enough to make this law pointless? Just have two encrypted sets of information in the same file. When you are asked for a key, it is up to you which one's key you give.


MRonney 2 days ago 1 reply      
I was watching 'Garrow's Law' yesterday. He said that "Laws which are passed in times of fear, are rarely removed from the statute books". Terrorists always win, because every time they attempt to strike, the government removes our basic liberties under the guise of protecting us.
prsutherland 2 days ago 0 replies      
Encryption isn't just about hiding your documents. It is also about securing your assets and providing identification.

- The passwords on your bitcoin wallet give you the authority to spend your money.

- Your encrypted signature requires your private key so others know your message came from you.

So, this law gives the government the ability to impersonate you and consume/use your assets in an unrecoverable way.

While the government might not have the authority to impersonate you or spend your money, they do have the authority to acquire the means to do so. And then all it takes is one dishonest person working for the government to use that information maliciously.

alan_cx 1 day ago 0 replies      
Please forgive my technical ignorance, but can an encrypted cookie be dropped into my browser cache by a web site? Could an encrypted image with hidden information on a web site end up in my cache? If so, millions of people could have terrorist data in their caches and never know, nor have the key to decrypt it. Also, who has that file WikiLeaks published as "insurance"? Anyone got the key? Anyone know what's in it?
yyyt 15 hours ago 0 replies      
This makes me wonder why Brits prefer to courageously make jokes at Putin's regime (with which I'm fine, they're deserved), instead of just going to the Big Ben palace and giving a boot to the same kind of governors sitting there.
vy8vWJlco 2 days ago 0 replies      
We have begun to outlaw privacy. This is wrong. Speak up, while you still have a voice.


jiggy2011 2 days ago 1 reply      
Assuming this article is true (which I am pretty skeptical of; I live in the UK and never hear about people being jailed for not giving up an encryption key):

What would happen if there is encrypted data on your system but you didn't set the key yourself? For example DRM systems usually work by encrypting data and trying their best to make sure you never acquire the key.

Albuca 2 days ago 0 replies      
This reminds me of this American case:


But on the whole, the article is scary and slightly unsettling. On the upside I don't live in the UK - but if we were traveling through the UK with our encrypted hard drives, would we be targeted by the law?

switch007 2 days ago 0 replies      
It makes me really angry seeing protests about laws which have already passed! It seems to be lazy journalism - after Liberty et al have done the hard work while the bill passes through parliamentary stages, once it's passed, traditional media and others pick up on it and start complaining.

Prevention is better than ranting after it's set in stone.

zaroth 2 days ago 0 replies      
Can you say, "Who is John Galt?"

Eventually the preposterous laws drive those with mobility to simply leave. Follow that to its logical conclusion: the UK will make it difficult to impossible to leave with your assets intact. Loss of privacy is just a precursor to loss of private property altogether.

muyuu 1 day ago 0 replies      
I live in the UK and this is the first I've heard about this. Interesting how such a seemingly important law passes so silently.
epo 2 days ago 0 replies      
This article is paranoid, ill-informed speculation, as are many of the Brit-bashing comments. The police have to show a judge they have good grounds to believe you are concealing evidence from them. Note also that if the powers that be are really determined to stitch you up, they will just plant data on you instead - much simpler.
ivanmilles 2 days ago 1 reply      
So, now Random actually /is/ Resistance?
baby 2 days ago 0 replies      
A scary article that forgets that many "stupid" or "vague" laws already exist and are either never used or always used in the right context.
chris123 1 day ago 0 replies      
Welcome to the future (Orwell, Minority Report, Enemy of the State, Matrix, etc.).
antoinevg 2 days ago 1 reply      
Roll on dual encryption. One key renders a dissertation on kittens, the other renders the original clear-text. Next problem?
Feoj 2 days ago 0 replies      
How does/would this affect Freenet users? As far as I know, a Freenet user's 'deniability' claim comes from the idea that the user does not know the key to the encrypted content hosted on their machine.
short_circut 2 days ago 0 replies      
So does this imply that I could go to prison for having an executable file presuming I can't "decrypt" it back into its original source code?
rashomon 1 day ago 0 replies      
Anybody know where I can find a thermite-holding 5.25" bay?
Zenst 2 days ago 3 replies      
I stand by my argument that you can have an encryption key that is, say, 2000 characters long. Print it out one character per page and submit that in advance at your local police station, getting a receipt. You are then within the law.

Now the question is: compression can be viewed as encryption. How does that pan out if you use a non-standard form of compression that does not require a key, since the compression formula is the key in itself?

adamt 2 days ago 4 replies      
I don't like or support the legislation - but I think this is a bit of an over-reaction.

The law as I understand it says that if you've got data (the law is focused primarily on targeting terrorism, child porn, etc.) that you've encrypted but refuse to hand over the encryption keys for, and the police convince a judge that there is valuable evidence in the encrypted data, and you still refuse, then you could ultimately go to prison.

Is this really any different to a digital search warrant?

Sure this law, like many others, could be abused. But I don't see it as anything to get too wound up about.

P.S. What kind of person has a 32GB file of satellite noise to generate random numbers with?!

Still running strong after 1,532 days without a code change. ryandetzel.com
328 points by ryandetzel  1 day ago   126 comments top 35
DanielBMarkham 1 day ago  replies      
The first contract program I wrote was when I was 17 years old. I was working at a service station changing oil. Some guy comes through who wants to stand around while I do the work (it happens) so we struck up a conversation.

Turns out he was a bookkeeper and had just purchased an Apple IIe and wanted to use it for his clients. I knew nothing about accounting, he knew nothing about computers, so it seemed like a good match :)

Four weeks of spending free afternoons at his shop, and it was ready to go. He was happy and I had 200 bucks in my pocket. Life was good.

Almost 20 years later, I get a call from him. He says the program isn't working so well and he wants to upgrade. I'm like WTF? Does anybody in the universe still even have a working Apple II anymore? Why would he keep using something like that for 20 years?

He told me that as computers modernized, it became a bit of a status symbol to have an older-looking system spewing out reams of reports. His customers, who were mostly small construction companies and such, got the feeling of stability and security from something that was unchanged.

It is a very strange feeling to get a call about code you wrote a long, long time ago. If I'd had any sense, I would have realized from the experience that programming is normally an extremely tiny part of actually making a business work. But it took me many more years to figure that one out.

josscrowcroft 1 day ago 1 reply      
This is just great. I had a very similar experience with http://openexchangerates.org - hacked it together and left it running, and only realised just how popular it had become when, one morning 6 months later, I woke up to over 20 emails, all saying similar things to what the woman said to OP.

"What the hell! It's been down for a day already! When is it coming back?"

I was pretty bemused, so I checked the server logs and found that it had been hurtling along at what looked like 60,000 requests per day (I later found out it was closer to 200,000).

Needless to say I jumped in, fixed it, and got talking to the users who relied on the project for their businesses or personal projects - it's been a blast.

edit - it's since been completely rewritten, so I can't say the code went unchanged...

quink 1 day ago 3 replies      
> I set a reminder(thanks Siri) to check these stats later that night.

Thank you, alarm clock! Thank you, spoon! Thank you, hammer!

In any case, thank you for the post, and thank you even more for keeping this kind of service running for so long :)

dugmartin 1 day ago 1 reply      
Zombie projects are fun. I created this in 2006 and haven't changed any of the code since, other than the copyright date (in 2010 - I guess I should update that).


I didn't add Google Analytics until 2008 but here are the stats since then:

    Visits: 304,619
    Unique Visitors: 246,368
    Pageviews: 361,097
    Pages / Visit: 1.19
    Avg. Visit Duration: 00:00:41
    Bounce Rate: 84.44%
    % New Visits: 80.87%

I get one or two thank yous a year from people that get my email from my whois entry.

nicholassmith 1 day ago 1 reply      
The nice takeaway from it is that the code you wrote works perfectly well for a large number of people and is obviously stable and bug-free enough for them to use it for their businesses. Code isn't just about hammering out new features or moving fast and breaking things.

And if it's written in Perl looking at the source code again might break it so I wouldn't do that.

forinti 1 day ago 3 replies      
10 years ago I wrote a sales system in Perl with a couple of friends. We charged something like US$750 (it was really simple: it uses a hash instead of a database). It is still going strong (no bugs reported) and about US$500,000 has been sold with it. That's what I call ROI and stability!
albertsun 1 day ago 1 reply      
That's the story with the calendar app I use, http://30boxes.com/

I don't think a new feature has been added in about 5 years and it barely seems maintained, there have been a few day-longish episodes of downtime but other than that it works great.

It has all the features I need and am used to, and I don't have to worry about the product constantly changing. In a way it's almost better to use a neglected product.

rodolphoarruda 1 day ago 0 replies      
This is living proof that the "web app market" is really stratified and that value perception within each customer group varies immensely. Who would have thought that a four-year-old scope of features could keep a product alive? Who could possibly agree that a system supporting thousands of accounts could survive without "hygienic" processes like OS/DB maintenance, bug fixing, etc.? Who could imagine a product with growing adoption/acceptance not having paid options planned to leverage revenue in the long run? Uff, this is all mind-boggling to me. Congrats to the author though, this is an incredible story!
padraigk 1 day ago 0 replies      
Good article and a good reminder of how what is just a side project to the creator can be so important to its users.

It looks as if he's updated his forum software about as regularly as the site's main code because it's full of spam (http://forum.invoicejournal.com/). Might be worth fixing that as the consistent number of sign-ups and invoices suggests there is a community of users there to be fostered.

scribu 1 day ago 0 replies      
From my experience, people will use any free service or product, even if you explicitly say "this product is abandoned".
adrinavarro 1 day ago 1 reply      
Please don't sell it or break it down. If it isn't costly to maintain, then don't sell it for scrap as most people seem to want it. Close sign-ups if anything, but free users look like they're happy with it and after trying it, it works nicely. It's something easy to clone and it's not really something anyone would develop on top of, so rather than selling it for its user base (selling your users) plus the concept, open-source it. If anyone wants to copy the concept, it's easy to (and anyone buying it would need to write it from scratch if they want to make money out of it, anyway).

That gives me mixed feelings…

perfunctory 1 day ago 2 replies      
Agile software development must be a conspiracy. When a corporate IT manager keeps hearing "embrace change..." whispered all around him, he starts thinking that there must be something wrong with their piece of software that's been running just fine for years. I guess it's good for the economy though.
emilw 1 day ago 2 replies      
After reading the entire article I noticed it doesn't contain a single link to the actual project. I like your style sir.
brianbreslin 1 day ago 0 replies      
I wrote a site on contract when I was 15 years old, exactly 14 years ago this summer. That site is STILL running. It was filemaker pro + Lasso on a mac server of some sort. http://mactalent.com I don't even want to know what kind of things are wrong with it today, but it made the guy who I built it for many thousands of dollars over the years.
ericcholis 1 day ago 2 replies      
Gives me a bit of a "ghost in the machine" feeling here. You created something interesting on a whim, gave it life, and sent it off into the world. Unbeknownst to you, it chugged on, somehow successfully urging people to use it. It did this mostly on its own.

I'm sure there's a white paper or talk on the life developers give their software...

jroseattle 1 day ago 0 replies      
While most people are looking for the next {x}-illion dollar opportunity (me included), the positive charge one gets when hearing that something they've created is actually useful to someone else is a great feeling.

Such a great story, makes me want to go fulfill those umpteen other ideas I've got hanging around to see what sticks. (I keep all these ridiculous domain names as kind of a skunkworks project task list.)

TamDenholm 1 day ago 0 replies      
Hey Ryan,

Id be very interested in taking this over and looking after it properly. Please shoot me an email and maybe we can talk about it.

Tam Denholm

nl 1 day ago 1 reply      
I think that is a great counterpoint to the story about the problems single founders have running 24/7 web apps.

It isn't clear though - is this a paid service?

dmor 1 day ago 0 replies      
Maybe you should keep it, not touch the code, and consider a new mission to do some marketing experiments with your 2 hours on the train every day... who knows what might happen
RobAley 1 day ago 0 replies      
If there is one thing that you do need to do, it's turn off the support forum. It's full to the brim with spam, and is (probably) the only part of the site that actually does need continuous attention. Otherwise, congrats!
michaelfeathers 1 day ago 0 replies      
> Still running strong after 1,532 days without a code change.

Oddly, I think that's a great definition of "systems software."

rwhitman 1 day ago 0 replies      
I have 2 products in a similar-ish situation, and for both of them I don't even remember where the source repo is, so I couldn't make a bugfix even if I had to...

But after a few outages that I had no clue about until days had passed, I did learn to sign up for Pingdom so I can at least reboot apache within an hour of the site going down

pnathan 1 day ago 0 replies      
Hi Ryan, this sounds interesting. It sounds like something that could use some TLC and maintenance.

If you haven't committed to a decision with it, give me an email (it's in my HN profile).


keithpeter 1 day ago 0 replies      
"I had forgotten about it, I had neglected it, I had basically written it off but this soft voice reminded me that all though I had written it off she had not and that she actually relied on it to run her business."

Is this likely to become more of an issue as people move to Web based applications for things they used Office and file folders for before?

Mc_Big_G 1 day ago 0 replies      
Heh, one of the first web apps I wrote was around 2002 and it ran on an NT4 box without a modification or a reboot until this year.

I left the company 1.5 years ago and got a call from a supervisor asking me how to start it up again after a power outage and UPS failure.

scottlilly 1 day ago 0 replies      
Inspired by this post, I went to the website of a company I worked for back in 1999-2000. The login screen for the first "real" web app I ever wrote looks exactly like I left it.

The company was in a bind with some contractors who were developing a web app for them, and I had recently written my own personal HTML site. So, I bought "Teach Yourself ASP in 24 Hours", read it over the weekend, and put together a little prototype, to see if I could actually connect to the database and report the information they wanted. I ended up writing that website for them over the next two weeks. Once I deployed it (no IIS web admins at the company), I went back to my regular Clipper work.

I wouldn't be surprised if it hasn't been touched in years.

Part of me wants to be happy that it's lasted so long. But I'm also a bit disappointed that all the other 'cool' stuff I've worked on since then probably hasn't had one-tenth the use of that site.

taftster 1 day ago 0 replies      
From the blog post: "Now I'm wondering if I had continued to work on it, if I hadn't neglected it could it actually be a useful successful, dare I say profitable project?"

I'd argue it's likely successful for the mere fact that he _hasn't_ touched it in four years. Had he actually tried to monetize the project, he probably would have just gotten in the way and screwed something up. This lends evidence to the "less is more" rule.

powertower 1 day ago 1 reply      
website: http://www.invoicejournal.com/

It's actually not a bad template. To the point.

kristianp 1 day ago 0 replies      
It makes me a little sad that it hasn't made him any money. I imagine it might be good for the resume, though.
nashequilibrium 1 day ago 0 replies      
Thanks for the post! I found this very inspiring, including some of the comments, since it shows that solving a real need almost builds a business by itself, with grateful consumers. It also does not require the latest and flashiest software tools. I wish there was a blog that reported on these small projects that are basic and solve a real need, without Facebook logins flying around and asking me to import my address book.
mgurlitz 1 day ago 0 replies      
The site is back up. Google has a cache now in case it goes down again: https://encrypted.google.com/#q=cache:http://ryandetzel.com/...
opendomain 1 day ago 0 replies      
Hey Ryan - I would love to help you with Invoice Journal. Contact me at Hacker @ (My username) dot Org
otaku888 1 day ago 1 reply      
I think both sites have just gone down. :)
ahmedaly 1 day ago 1 reply      
The website is very slow - maybe because of the load.
Consider moving it to a larger server, and monetize it somehow to pay for it!
Super-resolution from a single image weizmann.ac.il
301 points by nkoren  1 day ago   93 comments top 27
alister 19 hours ago 2 replies      
Dear Mr. Glasner, Bagon, and Irani (authors of the super-resolution paper):

Please apply this technique to resolve a historically important question about the assassination of President John F. Kennedy by identifying the license plate number of the car immediately behind the bus:



Better resolution photos should be available since the ones I've seen printed in books are better than these. And there are plenty of photos of Texas license plates from 1963 with which to seed your algorithm.

The reason this is interesting is because Roger Craig, a Deputy Sheriff in Dallas, said after he witnessed the assassination that he saw Lee Harvey Oswald run from the Texas School Book Depository and get into a Nash Rambler station wagon driven by another man.

These photos show a Nash Rambler station wagon that was indeed passing the Depository just after the assassination. If motor vehicle records still exist for 1963, then its owner can be identified.

The key point is that the existence of a person picking up Oswald after the assassination would strongly indicate the possibility that he was not acting alone.

I'll readily grant that this isn't likely to settle the issue, but I think it still would be amazing if the super-resolution approach was able to generate new information about a topic that no one expects to ever see new information about.

JTxt 1 day ago 4 replies      
I was skeptical of the bottom line of the eye chart image. The characters are 3x3 pixels. The "o" should translate to a dot; there is no hole.

Their explanation for it:

"The lines of letters have been recovered quite well due to the existence of cross-scale patch recurrence in those image areas. However, the small digits on the left margin of the image could not be recovered, since their patches recurrence occurs only within the same (input) scale. Thus their resulting unified SR constraints reduce to the “classical” SR constraints (imposed on multiple patches within the input image). The resulting resolution of the digits is better than the bicubic interpolation, but suffers from the inherent limits of classical SR [3, 14]."

So it's guessing based on the larger characters. Neat.

Here's their last line:


Here's the actual:


(from what I can read.)

( http://1.bp.blogspot.com/-VgvutrSWaFk/T4VI-2tDH2I/AAAAAAAAAX... )

This could probably help OCR.

Other than this, I think Genuine Fractals and BenVista PhotoZoom give similar results.

jawns 1 day ago 1 reply      
I'd be curious to know at what magnification this breaks down, if any. For instance, most of the images have been enlarged by about 3x or 4x. What happens when you blow them up to, say, 20x? It seems like this algorithm does some sort of vectorization -- because look at how crisp the lines are -- which would suggest to me that it scales well. But maybe I'm wrong.
ANH 11 hours ago 0 replies      
The first application that comes to my mind is improving the quality of photos in real estate listings. Half the time it looks like they were taken by a camera phone circa 1995 by someone who was skipping from room to room.
jhuckestein 1 day ago 1 reply      
Some of these images are in uncanny-valley territory for me. The bicubic images just look resized to me, but super-resolution images look real but "wrong". Given how much more "real" they still look than many computer graphics (especially realtime), I guess we won't leave uncanny valley behind us anytime soon.
kemiller 1 day ago 2 replies      
So this means that CSI's "enhance" is actually more possible than reputed.
rhplus 1 day ago 2 replies      
I just noticed that this is from a 2009 paper. Has this method made its way into any products by now?
cypherpnks 3 hours ago 1 reply      
I claim bullshit. The theoretically optimal algorithm is, and has been known to be for 40+ years, sinc interpolation. Gimp does this (Lanczos approximation). Comparing to bicubic leaves aliasing artifacts and looks bad. I'd be very interested in seeing the algorithm compared to something that's not a pure known strawman.

(In CS terms, this is akin to comparing your algorithm to something using a bubble sort, and ignoring the invention of n log n sorting algorithms)
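For anyone curious what "sinc interpolation" means in practice: Lanczos is sinc multiplied by a sinc window, evaluated over a small neighborhood of samples. A from-scratch 1-D sketch (the function names and the a=3 window size are our choices, not from any particular implementation):

```python
import math

def lanczos_kernel(x, a=3):
    """Windowed sinc: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def lanczos_upsample(signal, factor, a=3):
    """Upsample a 1-D signal by an integer factor with Lanczos interpolation."""
    out = []
    for i in range(len(signal) * factor):
        x = i / factor  # position in input-sample coordinates
        num = den = 0.0
        for j in range(math.floor(x) - a + 1, math.floor(x) + a + 1):
            w = lanczos_kernel(x - j, a)
            j_clamped = min(max(j, 0), len(signal) - 1)  # clamp at borders
            num += w * signal[j_clamped]
            den += w
        out.append(num / den)  # normalize so the weights sum to 1
    return out

# Original samples pass through unchanged; new samples in between are
# smooth, without the blockiness of nearest-neighbor interpolation.
up = lanczos_upsample([0.0, 1.0, 0.0, -1.0, 0.0, 1.0], 2)
assert abs(up[2] - 1.0) < 1e-9  # output position 2 lands on input sample 1
```

The same kernel applied separably in both directions is what image editors call "Lanczos resampling"; the paper's point, by contrast, is that non-linear, patch-based methods can recover detail no linear filter can.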

eridius 1 day ago 1 reply      
Is there any information on how computationally expensive this is?
jjcm 1 day ago 3 replies      
Are there any decent photoshop plugins that will achieve a similar result to this? Would love to have this in my workflow.
CodeCube 1 day ago 2 replies      
Maybe Apple can take this algorithm and use it as the scaling algorithm for low res images on retina displays.
gwern 8 hours ago 0 replies      
Interesting how these super-resolution approaches are slowly inching onto AI/machine learning territory.
petekp 1 day ago 0 replies      
This should be integrated into web browsers. Seems like it would particularly benefit people with high-density displays.
chmike 20 hours ago 0 replies      
Applying this to satellite images, with much more data available, could yield interesting results. Anyone interested in testing this method on Google satellite images?
mistercow 1 day ago 0 replies      
The subjective effect reminds me strongly of 2xSaI and hqnx upscaling filters, although of course those only work well for pixel art.
ck2 1 day ago 0 replies      
It looks like contrasting lines are almost vectorized then expanded?
rburhum 15 hours ago 0 replies      
Is there a BSD/LGPL-licensed C/C++ implementation? I'd love to try it out on some public domain satellite imagery to see the results.
sp332 1 day ago 0 replies      
The server is already slow, so if it stops working I think this coral cache version should still load. http://www.wisdom.weizmann.ac.il.nyud.net/~vision/SingleImag...
oulipo 1 day ago 0 replies      
It is the principle of NL-means denoising: since an image is redundant, you use information about shapes that are similar and present at different scales in the image to guess the shape of the zoomed version of the smallest ones by using the (more detailed) content of the bigger ones.
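That cross-scale redundancy can be sketched in a few lines of numpy: downscale the image, then search for where a given patch recurs in the coarse copy (function and parameter names are ours, and plain 2x2 block averaging stands in for a properly antialiased downscale):

```python
import numpy as np

def best_cross_scale_match(img, patch_xy, psize=5, scale=2):
    """Find where a small patch recurs in a coarser copy of the same image.

    This cross-scale recurrence is the redundancy such methods exploit:
    a good match in the coarse copy means the matched region's full-resolution
    neighborhood in the original image hints at what the patch should look
    like when magnified.
    """
    y, x = patch_xy
    patch = img[y:y + psize, x:x + psize]

    # Coarse copy via simple block averaging (crop to a multiple of `scale`)
    h = (img.shape[0] // scale) * scale
    w = (img.shape[1] // scale) * scale
    coarse = img[:h, :w].reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))

    # Exhaustive sum-of-squared-differences search over the coarse image
    best, best_pos = np.inf, None
    for cy in range(coarse.shape[0] - psize + 1):
        for cx in range(coarse.shape[1] - psize + 1):
            d = np.sum((coarse[cy:cy + psize, cx:cx + psize] - patch) ** 2)
            if d < best:
                best, best_pos = d, (cy, cx)
    return best_pos, best
```

A real implementation would use an approximate nearest-neighbor search over many scales rather than this brute-force loop, but the idea is the same.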
jostmey 23 hours ago 0 replies      
I am impressed. Still, I suspect all the images shown on the webpage closely follow the frequency spectrum of natural images. Different methods will do better under different frequency spectra.
t3kcit 4 hours ago 0 replies      
In the context of computer vision, this paper is pretty "old". Since then people have more focused on denoising, though.
wahnfrieden 1 day ago 0 replies      
Are there any implementations of this available?
Sapemeg 1 day ago 0 replies      
I wonder whether this kind of method can be as fast as the older ones.
stevewilhelm 1 day ago 2 replies      
Just a comment to the authors: may I suggest that the example magnified photos default to displaying your SR algorithm instead of NN interpolation.
guard-of-terra 1 day ago 2 replies      
The problem with these: every half a year we hear about a new revolutionary image transformation.
Today it's super-resolution; last year it was low-res PC game graphics interpolation; the year before that, a filter that could resize images in one direction (e.g. width) while preserving proportions and throwing away the less interesting parts. The image would become 2x narrower and you'd hardly notice.

Sadly, this is usually the first and last time we see the technology in question. They do not seem to produce any impact that could increase our quality of life. They just sit on some dusty shelves somewhere.

How not using Internet Explorer put me out of touch and cost me dearly blurity.com
296 points by tghw  1 day ago   130 comments top 22
gmurphy 1 day ago 6 replies      
In general, not using the platform your users use is a path to trouble. For example, because so many designers in the valley use Macs, we continually have to fight an OS X bias in our design process; when designing something, you tend to calibrate it against what you're used to, but when OS X is only 5% of the market, OS X-based designers of client software end up with a massive blind spot when it comes to understanding what comes naturally to the rest of the world.

One example is font and font size choices - because the system fonts and font rendering styles differ between platforms, it becomes very hard to tell what looks broken or 'not quite right' on the platform you're not used to. It's not uncommon to see sites launch with font choices that look rubbish on ClearType, but if you're not used to ClearType, it's hard to tell whether the rubbish is your fault or not.

Apple's excellent execution and Windows' (no-longer-deserved) poor reputation also mean you frequently hear excuses for this behavior like "Windows users won't care because they don't care about design" or "The Apple way is better, so we should do it that way on Windows too". Both of these are infuriating and lead to terribly designed products.

dmethvin 1 day ago 3 replies      
If you are creating a Windows binary and expect a user to download it, you should be signing the binary. Period. It's not just IE that considers unsigned downloads suspect, many antivirus programs do as well. If you are proud of your work, sign it.
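For reference, Authenticode signing is done with Microsoft's signtool; a rough sketch (the certificate file, password, timestamp URL, and binary name are illustrative placeholders):

```shell
# Sign the binary with a code-signing certificate and timestamp the
# signature so it remains valid after the certificate itself expires
signtool sign /f mycert.pfx /p MyPassword /t http://timestamp.example.com/authenticode myapp.exe

# Verify the signature the way Windows will (default Authenticode policy)
signtool verify /pa myapp.exe
```

The certificate itself has to be purchased from a CA that Windows trusts; self-signed certificates won't suppress the download warnings.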
tomjen3 1 day ago 4 replies      
While that is nice to know, it still smells of racketeering to me.

"that is some nice software you have there, would be a shame if users thought it was dangerous"

"pay a little money to one of these approved companies and that warning will go away"

If MS was serious about this only being for security they could issue the certificates for free and prove me wrong.

On the other hand, why is it that about 20% of users click past BOTH of these EXTREMELY scary warnings? Don't they read them at all?

alan_cx 16 hours ago 1 reply      
Just generally, and very simplistically.... buy this or we scare away your customers.

Errr, um, sort of.....well.... Mafia protection racket, yes?

Put it this way. What is the first thing that springs to mind when someone is scaring off your customers while demanding, sorry, politely implying a payment to stop?

Yes, yes, yes, I know. Security, user safety, lots of lovely logical arguments for it, I'm sure there are plenty. But strip it back to basics and, well, there it is. I presume that since MS is a big huge "evil" business which probably funds some political rodent, it's all cosy and legal.

Its more complicated, right?

krobertson 1 day ago 0 replies      
Better subject: "How not testing my website with all browsers, even IE, and ignoring metrics for months cost me dearly"
Hario 1 day ago 3 replies      
I'm no expert on these sorts of things, but it seems like the story goes something like this:

1. Dev checks out his site using IE

2. Dev realizes that IE users were getting scary warnings about his software

3. Dev has to pay up money to a third company to make the scary warnings go away.

Seems like a bad state of affairs to me.

lini 1 day ago 0 replies      
In Windows 8 the SmartScreen filter is part of the OS and not just IE. Even if you download unsigned code with another browser, trying to run it will result in the same nasty warnings.
keithnoizu 1 day ago 0 replies      
Although ~half of your revenue came from IE users, the numbers may not improve significantly after resolving this issue, since the IE users who completed downloads are also the ones invested enough in the application to download it despite the warning.
wlesieutre 1 day ago 1 reply      
I know some people have voiced concerns about Gatekeeper in 10.8, but this seems at least as bad. Especially from a normal user's perspective.
lisper 1 day ago 1 reply      
Is there a reason you don't offer deblurring as SaaS? I have a photo I'd happily pay to have deblurred, but I use a Mac.
rwallace 22 hours ago 1 reply      
Does it make any difference if you wrap the executable in a zip file, or does IE look inside the zip file and raise an alarm anyway?
fluxon 23 hours ago 1 reply      
On the subject of conversion rates, the Download and Buy links are separate, with no mention of an unregistered trial or watermarked demo mode. That uncertainty might be affecting your tryout rate. If the "Download" button said "Try it!", then the certainty that there is some usable trial would be higher. Side note: I notice that the (watermarked) saved images lack EXIF info - is that preserved in the registered version? This is very important for many photographers...
gubatron 16 hours ago 0 replies      
yup, Microsoft and the companies issuing certificates have been at this for over a year, we had to get a certificate last year when we saw this issue.

It's a nice money maker for them, getting all those yearly certificates, some charging several hundred dollars per year.

scosman 15 hours ago 0 replies      
Ahh, the irony! (The linked article has an SSL warning on IE9 on WinPhone.) https://dl.dropbox.com/u/9906763/IMG_7965.JPG
kuhn 1 day ago 1 reply      
Thanks for sharing the numbers. Great to see the process by which you worked out how much it was costing you. Good wakeup call really. Shame it took you so long to cotton on.
raam86 19 hours ago 1 reply      
You have created the perfect layman meter. I can't think of any of my friends or colleagues who would even consider searching for such a program, since they assume un-blurring cannot be done.
I didn't use your program, but this can be proved mathematically.
These people will never use Explorer, and even if they would,
they are the kind of crowd who actually reads error messages.

On the other hand you have my grandma, my aunt, random old folks for whom a red message = panic and an insta super-urgent call to me.

So yeah, far more laymen are using IE.
Sent from android.

As for the cert: once you know about it, you can simply explain it on the page.

jmboling 1 day ago 1 reply      
If you care about your product you have to draw a line somewhere. The more developers take a stance and refuse to support IE's criminally negligent treatment of broadly accepted standards, the sooner we can all eliminate the needless time cost of making sites agnostic to the point of stupidity.
qntmfred 1 day ago 1 reply      
I was browsing HN on my WP7 and this link gave me a "We're having trouble with this site's security certificate" message
lelele 17 hours ago 0 replies      
Reworded: "How not putting yourself into your customer's shoes while you are testing your software will cost you dearly."
drucken 14 hours ago 0 replies      
Just zip it.
vegas 1 day ago 0 replies      
How not watching four hours of television a day put me out of touch and was really awesome.
Show HN: BrainTripping - language model comedy matt.is
272 points by h34t  1 day ago   41 comments top 29
conesus 1 day ago 0 replies      
BrainTripping is a lot of fun, but what I love about it is that it's one of those ideas that is completely unique. Something I've never heard of before, and then I get to experience something completely new.

Once you get past the learning curve of knowing that the end trip is a mash of your thoughts and the original's writings, it becomes easier to see how to think on BT.

Kudos on the launch, and here's hoping for a bright future.

DanielRibeiro 1 day ago 0 replies      
Pretty cool! There are some hilarious submissions: http://images-cdn.braintripping.com/trips/4f9efd4088cbda0300...
slig 1 day ago 1 reply      
Really nice! Amazing graphics. Did you do them too?
minikomi 1 day ago 0 replies      
This is really fun! I kind of would like to be able to select a few different "expressions" from the character I choose to emphasize the tone of what they say.
mnicole 1 day ago 0 replies      
I laughed at the premise alone, but seeing the submissions and trying it out myself was a blast. Great work on this. I'd almost like to see an X vs. Y poetry slam.
rdl 1 day ago 0 replies      
This is really fun, but coming up with interesting things to say is hard independent of whose brain you are using.
yakshaving 1 day ago 0 replies      
Nice work on this! The illustrations are superb -- sort of reminds me of the "Fake Steve Jobs", "Fake Grimlock" and that whole set of cool meme-generating twitter feeds.

Seems a bit overdesigned/overengineered though, right? Do you really need the ad-lib typeahead? Or did you realize that people needed more constraints than just being able to type whatever they wanted into a box?

Seems like you could have an entire conversation between a few famous celebrities (alive or deceased) and then take that conversation page and make it atomic and shareable... and I feel like that would get traction.

Just my 2 cents.

jontang 1 day ago 0 replies      
Total blast! Love how each selection can take the sound bites in a whole new direction. Well done!
comatose_kid 1 day ago 1 reply      
Would be cool to allow people to have themselves added + feeding the site text written by them. Oh, and you need Gandhi on there.
obliojoe 1 day ago 0 replies      
I have been playing with this for a few weeks now and I've had some really great conversations with people I don't know, using the vocabulary of people who are not me. Surreal and addictive.
rufugee 1 day ago 1 reply      
Mind sharing what OCR software you used? I'm working on a project which has just this need.


jawr 1 day ago 0 replies      
Quirky idea, gave it a little whirl. Really love the feel of the website; the graphics are awesome! Nice work.
dwerthen 1 day ago 0 replies      
I enjoyed the way that the "intelligence" of it left a lot of room for one's creativity.
Nice game/tool when one is in need of a little something to get the writing flow going!
caublestone 1 day ago 0 replies      
This is better than anonymity.
delano 1 day ago 1 reply      
Wow, great work! It takes a lot of effort to think and write in character (phrases, vocabulary, and most importantly mood).

I've had a lot of fun tweeting as the old-timey characters for one of my brands (https://www.blamestella.com/news) but it's definitely felt like hard work at times. Huge kudos for really breaking it down and simplifying it.

Axsuul 1 day ago 1 reply      
Really creative idea and +1 for Backbone.js! It seems like your Leaderboard views linger when changing to another view. Otherwise, really polished, good luck!
_pferreir_ 1 day ago 1 reply      
Great! Are you using a Markov chain for the suggestions? Is it a "dissociated press" kind of thing?
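(For readers curious about the "dissociated press" technique the comment above asks about: a minimal word-level Markov chain text generator can be sketched in a few lines. This is a hypothetical illustration of the general technique, not BrainTripping's actual implementation.)

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=10, seed=None):
    """Random-walk the chain, emitting up to `length` words."""
    rng = random.Random(seed)
    key = rng.choice(list(chain.keys()))
    out = list(key)
    while len(out) < length:
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:  # dead end: no word ever followed this state
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

Trained on a character's writings, a walk like this produces text that locally resembles the source but wanders globally - the classic "dissociated press" effect.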
reddickulous 1 day ago 1 reply      
I want something like this but hooked up to my facebook friends.
mdonahoe 19 hours ago 0 replies      
This rules.

I don't like the text input system.

yycom 10 hours ago 0 replies      
Tutorial doesn't work in iOS.
NHQ 1 day ago 0 replies      
This is incredible!
oulipo 1 day ago 0 replies      
awesome, I'm currently working on a website to build experiments around the writing process

I have quite a few ideas and I like those kinds of projects that liberate creativity (through constraints!)

roycyang 1 day ago 0 replies      
Love the ability to download the trip so I can add it to my own site.
tunnuz 1 day ago 0 replies      
That's a really interesting project. And I love the graphics!
fredsters_s 1 day ago 0 replies      
Love this site
ybother 1 day ago 0 replies      
~or~ how to get a bunch of people to train your neural network
bonaldi 1 day ago 0 replies      
Only Facebook connect or manual login? I choose neither.
Richard Posner: Why There Are Too Many Patents In America theatlantic.com
248 points by joshuahedlund  2 days ago   82 comments top 14
grellas 2 days ago 2 replies      
This piece amounts to a red alert signal from a distinguished judge to Congress that it needs to fix some pernicious elements of the U.S. patent system and that it needs to do so now. The tone is judicious but the message is essentially alarmist: the system is seriously out of whack and Congress needs to get on with fixing it.

Judge Posner admits he is no expert on what the fixes should be and his tentative suggestions for fixing the system are, in my view, decidedly mixed on their merits (e.g., specialized adjudications before the USPTO - remember when it was suggested that a specialized appeals court would improve the patent system and the result was a court that has been so maximalist in its approach to patents that it has in itself become a significant part of the problem).

So where to begin?

Legally, it has to go back to fundamentals and, for me, this has to go back to the scope of patentable subject matter and whether this should be defined to include software at all.

The Patent (and Copyright) Clause of the Constitution (Article I, sec. 8, cl. 8) provides that the Congress shall have the power "to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries." Note that, in defining this as one of the enumerated powers of the federal legislative branch, the Constitution does not mandate that the legislature provide for patent protection of any sort. It merely permits the exercise of such a power within constitutionally prescribed limits. Thus, any legitimate exercise of patent authority in the U.S. must come from Congress and must respect the constitutional bounds that any grant of patents be for "limited times" and be done in such a way as "to promote the progress of science and useful arts." Legally, then, any patent system in the U.S., if adopted at all, must be authorized and defined by Congress with a view to promoting the progress of science and, implicitly, must employ "limited times" consistent with what it takes to promote scientific progress.

The first issue, then, is whether patents are needed at all to promote the progress of science. In the U.S., in spite of philosophical arguments to the contrary by Jefferson (http://news.ycombinator.com/item?id=1171754), this has never been seriously in dispute. The industrial revolution was already well in progress in 1789, when the Constitution was adopted, and the federal authority, though generally regarded with great wariness at the time, was seen as vital to protect the rights of inventors and to reward them with limited monopoly grants in order to encourage the progress of science. In the first U.S. Patent Act (Act of April 10, 1790, 1 Stat. 109, 110), Congress implemented its constitutional authority to sanction patent monopolies by defining patentable subject matter very broadly, to include "any useful art, manufacture, engine, machine, or device, or any improvement therein." Congress amended the Act in 1793 and then again in 1952, so that today it reads as to the idea of "patentable subject matter" as follows (35 U.S.C. sec. 101): "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Thus, patents in the U.S. can be granted for any original invention that fits within the definition of patentable subject matter and that also meets the other conditions of the patent act (i.e., that is useful and non-obvious). Note, though, that the 1952 definition of patentable subject matter significantly expanded the scope of such subject matter in the name of bringing the patent laws up to date with developments in then-modern technology, all in the name of promoting the progress of science. It did so by defining patentable subject matter to include any "new and useful process" as well as any "new and useful improvement" of any original invention. Over time, "process" has come to embrace business methods and also software. And the protection of "useful improvements" made clear that new uses of existing machines or processes could be patented notwithstanding older Supreme Court decisions such as Roberts v. Ryer, 91 U.S. 150, 157 (1875) ("it is no new invention to use an old machine for a new purpose").

To promote the progress of science, then, Congress in 1952 allowed patents to be granted for any inventive process and for any inventive new use for any such process. In my view, this generally made sense for what was essentially the continued playing out of the same sort of industrial revolution that animated the original forms of patent protection granted in 1790. Looking at that language at that time, one could readily make the case that patentable processes and improvements thereon could and did promote the progress of science. Discrete inventions tended to be sharply differentiated and tended to involve significant development effort in time and resources. An inventor could keep a process secret and not patent it but the grant of a limited monopoly gave a decided inducement to disclose it to the world and, hence, to expand the broad pool of scientific know-how available to society.

Then came the digital revolution and, with software, a new or improved process can amount to an almost trivial variation on prior art amidst a seemingly endless stream of improvements developed in ever-rapid succession and with little or no capital investment beyond what developers would be motivated to do for reasons entirely independent of gaining monopoly protection for the fruits of their efforts. Moreover, there is little that is closed about such innovations: a wide knowledge base is in place, known to an international developer community that is basically scratching its collective head asking why it should be restricted legally from using techniques and processes that represent common knowledge in the field.

The main question, then, concerning software patents, is whether the existing framework makes sense as one that promotes the "progress of science" insofar as it grants patent protection to process inventions in this area. Congress needs to seriously ask itself that question. A second question, also tied to constitutional authority and assuming that it is legitimate to grant some form of patent for such inventions, is whether a 20-year period of exclusivity makes sense in an area where innovation occurs at blazing speeds and with not too much capital investment tied specifically to any given discrete invention. Is that necessary to promote the progress of science? That too is a question that Congress needs to consider.

Thus: (1) there is nothing magical about the current definition of patentable subject matter and Congress can adapt this to suit the needs of the time in promoting the progress of science, (2) process patents are in themselves a fairly recent phenomenon (at least in any large numbers) and it is no radical change to curtail them in areas where they make little or no sense in light of the constitutional purpose for why patents even exist in the first place, and (3) legitimate patent reform needs to go far beyond procedural fixes around the edges of the system and needs to focus on the realities of modern technology and whether the patent laws further or impede the progress of science as applied.

The policy debate can and will go all over the board on this but, if it is framed in light of the constitutional foundation for having patents in the first place, it can be shaped in a way that puts the focus on the fundamentals of what needs to be fixed as opposed to lesser issues that do not get to the heart of the problem. The main problem today is the blizzard of vague and often useless patents in the area of software. These are effectively choking all sorts of innovation and are benefiting mainly lawyers, trolls, and others who do not further technological development by what they do. It is a mistake, in my view, then, to swing too broadly in trying to fix things (as by advocating abolition of all patents) or to be so timid about the issues that reform is marginal at best and ineffective in dealing with the current crisis of an over-abundance of essentially worthless patents. Congress embraces the patent system as a whole and shows no hostility to its fundamentals. Reform must be shaped in light of those fundamentals but it must, at the same time, be meaningful to eliminate the main garbage from the current flawed system. Judge Posner has pointed the way generally and proponents of reform ought to follow his lead, with the focus being (in my view) on software.

cletus 2 days ago  replies      
Posner may well be one of the most important figures in tech in the coming decade for standing up against the lunacy of software patents. What Congress and the President don't seem to understand is that the cost of patent litigation in the US poses an existential threat to America's dominant position in tech.

One of the most compelling arguments to me (against these patents) is that in pharmaceuticals, for example, you are dealing with a handful of patents. Some processes might be patented, maybe even some equipment (easily licensed generally) but basically the patents that go into a process (that may itself be patented) are minimal and can be reasonably well understood by those running such businesses.

Posner pointed out that a smartphone may well contain (and violate) thousands of patents. That right there is a sure sign that something is rotten in the state of the patent system.

The solution here isn't reform, as some suggest (ie raising the bar to what's patentable). It's simply to get rid of them. First-to-market and execution are what matters and what should matter. 20 year exclusives for vaguely worded patents on things that are more often than not obvious is just a means for big companies to extinguish smaller companies.

mtgx 2 days ago 4 replies      
"In most [industries], the cost of invention is low; or just being first confers a durable competitive advantage because consumers associate the inventing company's brand name with the product itself; or just being first gives the first company in the market a head start in reducing its costs as it becomes more experienced at producing and marketing the product; or the product will be superseded soon anyway, so there's no point to a patent monopoly that will last 20 years; or some or all of these factors are present. Most industries could get along fine without patent protection."

Wow, this guy really gets it. This is how markets and competition work. There's no need to give a company a legal monopoly. If anything, that lack of monopoly will force companies to keep inventing new things to stay one step ahead of their competitors.

I also love this one:

"forbidding patent trolling by requiring the patentee to produce the patented invention within a specified period, or lose the patent"

These days big tech corporations are filing patents as fast as they can print them on paper. And then 95% of them will probably never be used in products that are shipping in the market.

jpadkins 2 days ago 1 reply      
I used to be like Richard Posner, where I was generally against patents except for a few cases like pharmaceutical. Until I read Against Intellectual Monopoly. http://levine.sscnet.ucla.edu/general/intellectual/againstne...

See chapter 9 for a historical analysis of the pharmaceutical industry in countries without patents. The surprising result is that companies in countries without patent protection were producing as many new drugs as the patent-protected companies.

Now I am a full anti-IP advocate, except for certain trademarks and attribution of authorship (so people know which company/author a product came from).

kiba 2 days ago 1 reply      
Whenever a congressman or a member of the executive branch does something, I usually hate their guts.

Whenever a judge decides something, it usually makes me like them.

In fact, Americans trust their judges more than their politicians and bureaucrats. http://www.gallup.com/poll/143225/trust-legislative-branch-f...

BenoitEssiambre 1 day ago 0 replies      
In my opinion, the key characteristic of software that makes the current patent system not suitable is the vast amount of interdependence inherent in this type of technology. Code is not used side by side like an infantry of little computer processes working in parallel to make computers or phones go. It is rather organised and packaged in a huge network of building blocks, a pyramid of hundreds of thousands of libraries, APIs or functions heavily inter-dependent on each other. What's more, the building blocks are usually not all written by the same people or organisations and a consistent interface is critical for enabling compatibility (often between millions of parts made by thousands of developers).

This hierarchical and networked architecture is inevitable and it is the best way to organize such complex information, however the stability it requires at the bottom of the pyramid of code means that some building blocks cannot be changed once the pyramid is built. Someone claiming ownership of the shape of a bottom-center block after the pyramid is built, someone having the power to force a bottom block to be removed and replaced with a differently shaped one, no matter how simple and obvious this block is, does not have power over just this block but over the whole structure above it and all the components that depend on it. This means patent holders have a disproportionately large amount of power when they target such a block. From a possibly trivial piece at the bottom, they can control a vastly more sophisticated structure built on top, which they had no part in conceiving. They know that changing it would require tearing down, redesigning and replacing tons of dependent work and would probably break compatibility for huge numbers of users of these projects.

WalterBright 2 days ago 1 reply      
Before 1989 or so, software was assumed to be not patentable. This did not appear to slow down innovation or progress in software in the slightest.
guygurari 2 days ago 0 replies      
"There are a variety of measures that could be taken to alleviate the problems I've described. They include: reducing the patent term for inventors in industries that do not have the peculiar characteristics of pharmaceuticals that I described; ..."

To me the obvious solution, and the one missing from this list, is to abolish patents altogether in such industries (including the tech industry). I wonder if Judge Posner would agree, and if so, why not come out and say it? Would this be considered too radical at this point in time?

vibrunazo 2 days ago 0 replies      
I'm extremely skeptical of the proposed solutions. I haven't yet seen a solution that would be a clear net win for society after summing the pros and cons. It seems to me that fixing patents is a mathematical impossibility: trying to come up with a system that forces most inventors to pay a few inventors, while at the same time not punishing most inventors. It sounds like trying to come up with a number that is less than 2 while at the same time greater than 1000. It's mathematically impossible.

The optimist in me would love to believe there's a brilliant solution, which is way over my head. The realist in me can only see paradoxes and no obvious solution. Maybe I'm just too dumb to solve this problem myself.

I believe the right path is to look back at what the vision behind patents are in the first place (incentives for invention), and think from the ground up how we can implement this without the modern "necessary" dogmas (such as licensing or IP). Then I can actually think of plenty of solutions. But none of them even remotely resembles what we know today as a patent.

clarle 2 days ago 0 replies      
Great points all around, but I don't necessarily agree 100% with his thoughts on the pharmaceutical industry.

For specific drugs, this may be the case, but when you have pharmaceutical companies doing things like patenting specific gene sequences, forcing both other companies and academics to get licenses or permission just to perform research on something completely different, that's just ridiculous.

How patents work should be more flexible, and not limited to just whatever industry they're in.

rogerchucker 2 days ago 0 replies      
Isn't the fundamental problem in software patents that patent holders have the option of NOT licensing their patents to infringing parties? An outright ban based on patent infringement is criminal, in my view. Collecting rent based on infringement, on the other hand, is completely fair.
keithpeter 2 days ago 1 reply      
"...eliminating court trials including jury trials in patent cases by expanding the authority and procedures of the Patent and Trademark Office to make it the trier of patent cases, subject to limited appellate review in the courts..."

UK perspective: what do people here think of this suggestion, perhaps even as a temporary 'damper' on the patent troll business model? Raising the barrier to litigation would perhaps slow down the rate at which these cases occur. In the UK, we have a special court for trying IP cases, and the barrier to litigation is very high, perhaps too high for some small companies. Of course, the EU does not allow the granting of software patents.

Zenst 2 days ago 5 replies      
What I don't understand, and also feel is an issue with patents as a whole, is that you can patent something without actually being able to show a working example/product.

For example: somebody could patent teleportation, define it, and then when somebody else does all the hard work and actually invents a teleporter, you are able to cry patent violation and cash in. That to me is completely wrong, yet that is how the patent system currently stands.

I have also noted that a lot of patents with no working prototype or product all seem to have been done in some sci-fi movie/TV series previously, and I find it somewhat surprising that the movie industry has not started jumping on this patent bandwagon, as they have more of a working prototype than many patents that get approved in this day and age.

RockofStrength 1 day ago 0 replies      
The good thing about patents is that they motivate the discovery of alternate approaches. The bad thing is that they stall interconnective progress.

Patents have become a hideously bureaucratic market unto themselves, creating a kink in the hose leading to the fountain of progress, but the fountain attained its magisterial beauty partially as a result of the motivation to circumvent roadblocks.

Betaworks has acquired the core assets of Digg betaworks.com
241 points by mikerice  2 days ago   128 comments top 22
alaskamiller 2 days ago  replies      
Reddit won the game, slow and steady. Reddit now literally powers the generation of news cycles across the blogoverse. BuzzFeed? HuffPo? Gawker? TheDaily? Reddit, yesterday.

First things first: rescue the culture. Digg engineering has already fled, the peanut gallery in the valley keeps squawking with delight about your failure, and all you have left is some nicely designed pages showing double-digit gains on stale links that are a shade away from looking as if they were spun up by a Russian spam squad. Save yourself: redo the logo, redo the color scheme, don't let legacy drag you down.

Second, do some soul searching and figure out what layer Digg wants to play in. Link aggregation? Community building and content creation? Traffic? Attention? Engagement? That elusive ad-sharing model for content creators?

After that figure out for who. Reddit's core is dying, and eventually it'll be fully crowded out by the mainstreaming of rage comics, aww pics, and counter-Tumblr-pseudo-nerdy programming. Do you target those people, the walking wounded much like Slashdot?

Or do you go after the youths, the ones addicted to Instagrams, 9gag, imgur, but bored of their Facebook feeds? Then play the waiting game. New is everything old, after all.

The game's so different now from 2005. All the majors have feeds now. All the majors have figured out sharing, commenting, and extracting action on a story item. On top of that you're competing against mobile guys like Flipboard.

This is one of those situations where execution is easier than creating an idea. What is Digg's $1MM idea? Good luck with that because Digg needs to be futuristic but also really lucky twice being at the right place, the right time, with the right idea. Once you're lucky, twice you're good, right?

I bet this kind of flame out, this one thing keeps Mark Zuckerberg from sleeping.

jgroome 2 days ago 5 replies      
> The News.me team will take Digg back to its essence: the best place to find, read and share the stories the internet is talking about. Right now.

With my cynicism hat on nice and tight, it seems a bit late for that. Digg WAS, in its heyday, the best place to find and share online content. But as management made one iffy decision after another, and as imitators sprang up left, right and centre, it lost its place in the online world. Sad.

We have plenty of content aggregators now. The kind of people Digg 5.0 (?!) will target are probably more than happy finding their entertainment through Twitter, Facebook, and dare I say it, Reddit.

Having said that, I for one wish the new team all the luck in the world in trying to get the site back to its former glory, and I look forward to trying it out.

uptown 2 days ago 8 replies      
From WSJ: "The price was just $500,000, three people familiar with the matter said - a pittance for a company that raised $45 million from prominent investors including LinkedIn founder Reid Hoffman and Marc Andreessen."


marknutter 2 days ago 2 replies      
This is a case of a company having delusions of grandeur. Instead of humming along with 10 or so employees raking in cash hand over fist, they tried to grow as fast and as large as possible to try to revolutionize news. Instead, Reddit did the former and is stable and profitable; it never pretends to be something it's not.

Not all companies are going to change the world, nor should they. I see Facebook and Twitter making the same mistakes. Perhaps Facebook is a great place to keep in touch with friends but nothing more. Perhaps Twitter is simply a great replacement for blogging, nothing more. Instead of filling their niches and quietly making a small number of employees very wealthy, they try to become these massive institutions that are reliant on very fickle user bases. It's a house of cards.

adventureful 2 days ago 2 replies      
The only reason it's $500k and not $0 is to save a small amount of face. Digg is losing money, and has never made money in its entire history, so it's going to cost Betaworks a lot more than $500k to take it over.
bitwize 2 days ago 0 replies      
Why? Reddit is like Digg 2 now. Back in the day Digg was for dudebros who wanted to swap Girls Gone Wild links while Reddit was the "isn't Haskell grand?" place. But since about 2008 or so, when Reddit rubbed up against Digg and 4chan and a big chunk of their user base rubbed off and clung to it, Digg has been wholly redundant.
chimi 2 days ago 3 replies      
Sorry for my ignorance. I've been searching for exactly what went wrong with the Digg redesign, but I can't find anything that really spells it out. Something with side-by-side comparisons or a deeper analysis than "Users leave in droves after Digg redesign that alienates user base." How did it alienate those users? Of course I was aware that it happened, but because I wasn't part of the community, I never understood what it was, exactly.

Also, is this an example that illustrates why craigslist refuses to redesign their site? A redesign really could spell the end of craigslist, like Digg.

spinchange 2 days ago 3 replies      
I used to love Digg in the early days. People point to the v4 revision as the final nail, but they started wavering and losing the magic literally years before that.

FWIW, Pligg (the digg clone CMS) looks like it's getting a twitter bootstrap upgrade. I've been wanting to roll my own "digg/reddit/HN" for ages. Maybe now's the time.

untog 2 days ago 0 replies      
I don't doubt that something can be done with it, but I'm dubious of Digg as a brand these days. Similar to MySpace, it's lost all the credibility it once had.
Tobu 2 days ago 0 replies      
Digg is a shadow of an online community, but its fall gave us War, a three-part social media epic by ncomment: https://secure.flickr.com/photos/25036088@N06/3424896427/siz...
cantbecool 2 days ago 0 replies      
The first iteration of digg looks awfully like reddit.


dr_ 2 days ago 0 replies      
From techcrunch: "Betaworks, says Digg, will soon unveil a new “cloud-based version of Digg” that will complement News.me's iPhone and iPad apps"
Not sure what this means, Digg was never a desktop client.
It's betaworks using lingo that will grab the attention of non tech readers to suggest something interesting is going on. But it isn't. Just like with a lot of their other stuff. Bitly is a simple concept which has been done many times. News.me is nothing interesting at all, just an attempt to rip off Flipboard. Chartbeat is just another way to visualize data that's available by so many other means.
Nothing original with betaworks.
zinssmeister 2 days ago 0 replies      
the german Digg clone Yigg.de also sold today. Funny how that timing works out. (http://yigg.de/nachrichten/2012/07/12/ekaabo-uebernimmt-yigg)
fein 2 days ago 1 reply      
I am incredibly skeptical about the plausibility of resurrecting Digg, but for 500k I sure hope they do. It was my first social media home and it would be nice to be able to go back to what it was; not a marketing front end for corporate prostitution.

Fingers crossed, but betaworks seems to want to take Digg in the direction it should have gone during the v2->v3 upgrade.

CubicleNinjas 2 days ago 0 replies      
How the mighty have fallen...
PaulHoule 2 days ago 2 replies      
the trouble is that fixing Digg means getting rid of the existing user base
saurik 2 days ago 1 reply      
I wonder what they consider the "core assets" to be; some specific items I'm curious about: domain name, codebase, hosting contracts, Aeron chairs, employees.
paulbjensen 2 days ago 0 replies      
This should be a film, the story has everything: business press coverage, the BluRay code event, death threats to the founder, a near-deal with Google, the exit of the CEO, the 4.0 debacle, the company downsizes, the founder leaves, the company struggles, then the company sells its shell.

I hope BusinessWeek does a piece on this.

dreadsword 2 days ago 1 reply      
$500k: hard to tell if they've over- or underpaid. What's the Digg Balance Sheet look like? How about the last year of cash flow statements? Without those, it's impossible to evaluate the worthiness of the deal.
synack 2 days ago 0 replies      
I imagine it'll end up costing them quite a bit more than $500k when the dust settles... Digg probably has quite a bit of debt racked up by now.
anewguy 2 days ago 1 reply      
How much do you think it would cost to buy reddit?
ruhsler 1 day ago 0 replies      
news.me + digg could be good
Some things I've learnt about programming jgc.org
240 points by jgrahamc  2 days ago   123 comments top 20
AngryParsley 2 days ago  replies      
Programming may be a craft, but researchers have published tons of studies about this craft. Many of these studies contradict anecdotal evidence. For example, copying code isn't as bad as you might think: http://www.neverworkintheory.org/?p=102

Another example is TDD. People espouse the benefits, then some study comes along (http://www.neverworkintheory.org/?p=139) saying the benefits are largely illusory and that code reviews are more effective.

Instead of listening to the experts at programming, listen to the experts on programming. Read some studies about the effectiveness of various tools and methods. Try new things. Programming is a craft, and like many crafts it contains significant amounts of dogma passed from teacher to apprentice.

defdac 2 days ago 10 replies      
"You have to ask yourself: "Do I understand why my program is doing X?""
"I almost never use a debugger."
"And if your program is so complex that you need a debugger [simplify]"

To me, being afraid of a debugger is like being afraid of actually knowing exactly what is going on - being lazy, just reading logs and guessing what might have gone wrong, instead of letting the debugger scream in your face all the idiotic mistakes you have made.

I would argue that using the debugger is being lazy in an intelligent way, instead of spending hours reading endless logs trying to puzzle together logic the debugger can show you directly.

bambax 2 days ago 0 replies      
A lot of those things apply to many human activities.

I don't build bridges but I would be very surprised if an architect described his work as "pure science and no craft at all" (how would it be possible, then, to build beautiful / ugly bridges?)

I do a little woodworking and have many tools; friends sometimes look at my shop and ask if I really need all that -- yes, I do. In the course of a project you get to use many different tools. You can get around missing one, but it takes far longer to work without the exact tool. (Same thing with photography).

I'm learning to fly, and the most important word regarding human factors is "honesty". The way to fly is not to avoid mistakes, it's to detect them and minimize the consequences; if you feel you can do no wrong you'll eventually kill yourself.

vbtemp 2 days ago 1 reply      
> 0. Programming is a craft not science or engineering

Unless, of course, you are software engineer-ing.

Flight guidance-and-control systems, among many other things, are precisely engineered software systems. In a world of web apps and mobile apps, people tend to forget this kind of software exists.

Sure, working on your web app, writing some JQuery widgets, or coding up some python scripts is a craft.

StavrosK 2 days ago 8 replies      
I never noticed until now, but I never use a debugger either. An employer urged me to start using one while I was working on their code, but it's just not natural for me. If I need to debug a crash, I just print the 1-2 variables of the state involved, see what's wrong, and fix it.

Is there anyone who uses a debugger for more than inspecting state?

EDIT: I guess lower level languages and more involved applications use debuggers much more extensively.

asto 2 days ago 4 replies      
A large part of the post can be rewritten as "don't be lazy".

1. Don't be lazy and just do something that works without taking the time to learn why it works.

2. Don't be lazy and just stop when you have something that works. Go through the code again and see if you can make it better.

4. If you find yourself writing the same thing twice, don't be lazy and carry on, put the code in a single place and call it from where you need it.

Or at least that's how I see it. I do all of the things I shouldn't do, largely because doing things the wrong way is so much easier!

Edit: Rather than rewritten, I meant "falls under the general category of". The article was great!
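A minimal JavaScript sketch of point (4), the "put the code in a single place and call it from where you need it" move. All names here are illustrative, not from the article:

```javascript
// Before: the same normalization step written twice.
function greetUser(name) {
  const clean = name.trim().toLowerCase();
  return "Hello, " + clean;
}
function farewellUser(name) {
  const clean = name.trim().toLowerCase();
  return "Bye, " + clean;
}

// After: the shared step lives in one place and is called from both,
// so a future change (say, stripping punctuation) happens once.
function normalize(name) {
  return name.trim().toLowerCase();
}
function greet(name) {
  return "Hello, " + normalize(name);
}
function farewell(name) {
  return "Bye, " + normalize(name);
}
```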

zethraeus 2 days ago 0 replies      
It's nice to see number 5.

>Be promiscuous with languages

>I hate language wars. ... you're arguing about the wrong thing.

It's easy to take this for granted, but it's a concept that is very important to stress to new coders. If you spend too much time focusing on one language you run the risk of the form becoming the logic. This is a dangerous place where your work can be better analogized to muscle memory than to logical thought.

At least in a college environment, I think this lack of plasticity causes discomfort with different representations of similar logic - and so flame wars abound.

wissler 2 days ago 1 reply      
Taking personal pride in not using a debugger is a bad idea. Sometimes it's the right tool for the job, and if your picking it up makes you feel dirty, you're only handicapping yourself.
einhverfr 2 days ago 0 replies      
These are all golden lessons that people who think about writing code generally learn.

One thing I would add though is that there are many times when there is time pressure and a kludge works. The right thing to do here is to document that it is a kludge so that if/when it bites you later you have a comment that attracts your attention to it.

"I don't understand why this fixes the problem of X but this seems to work" is a perfectly good comment. It's great to admit in your comments what you don't know. (That's why questions relating to commenting are great interview questions IMO.)

Finally, I think it's important in the process of simplification to periodically revisit and refactor old code to ensure it is consistent with the rest of the project. This should be an ongoing gradual task.

Anyway, great article.

Tashtego 2 days ago 0 replies      
"I'm not young enough to know everything"

Having recently started mentoring/managing the first really junior engineer on our team (self-taught, <1 year programming experience), boy does this ring true. Luckily I'm of the temperament to find the "advanced beginner" stage of learning more funny than annoying.

I think it's possible to understand as little about your code when using loggers as when using debuggers, so I have a hard time agreeing with him there. I think his general point about having tools and knowing when to use them applies just as much to that as it does to language, so he contradicts himself.

bromagosa 2 days ago 0 replies      
The debugger part doesn't look generic enough to me. As a Smalltalk programmer, I can only say usage of debuggers depends _a lot_ on which language you code in.

In Smalltalk, you practically live inside the debugger. Also, if you are an ASM programmer, the debugger is indispensable.

10098 2 days ago 0 replies      
> I almost never use a debugger. I make sure my programs produce log output and I make sure to know what my programs do.

I used to do precisely that. Sprinkle code with log messages, recompile and run. When I finally learned how to use gdb, my debugging productivity increased tenfold.

I mean, just the ability to stop your program at any given point gives you an enormous advantage. You can not only examine the local state of your program, but also you can see how the state of systems outside of your program (e.g. database) changes, and all of this without polluting the code with tons of useless debug messages.

Often when I had new ideas during bug hunts, to test my hypothesis without a debugger I had to go back and add new logs, then recompile, then run (and make sure it reached the same state as before!) - lots of wasted time. With a decent debugger it's as easy as typing an expression.

And I don't think debuggers lead to lazy thinking. The process of finding the problem is the same whatever method you use - you analyze the code, have an idea about what could be wrong, change one thing, then see what happens. Debuggers just make it easier.
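A tiny sketch of the two styles being contrasted above: a log line you have to edit in, run, read, and edit again, versus a breakpoint the debugger gives you for free. The function and the bug are hypothetical, and the debugger commands in the comment are Node's built-in debugger (the gdb workflow is analogous):

```javascript
// The "sprinkle logs, recompile, run" style of hypothesis testing.
function applyDiscount(price, rate) {
  // Edit this line in, run, read the output, edit again...
  console.error("applyDiscount state:", { price: price, rate: rate });
  if (rate < 0 || rate > 1) {
    throw new Error("rate out of range: " + rate);
  }
  return price * (1 - rate);
}

// With a debugger the console.error line becomes a breakpoint instead:
//   $ node inspect app.js
//   debug> sb('app.js', 4)   // stop inside applyDiscount
//   debug> c                 // continue to the breakpoint
//   debug> repl              // evaluate price, rate, or any expression
// No edit/re-run cycle, and no leftover log noise in the code.
```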

raheemm 2 days ago 2 replies      
#7. Learn the layers. Is this even possible anymore? Seems like with APIs and frameworks there are layers upon layers just within code. Then you have the OS, the hardware, the network layer (whoa! 7 layers right there!)...
mustafa0x 2 days ago 2 replies      
> Programming is much closer to a craft than a science or engineering discipline. It's a combination of skill and experience expressed through tools

You seem to be implying that the latter statement doesn't apply to the disciplines of science and engineering. "Skill and experience expressed through tools" is highly important in both watchmaking and bridge building. I would advise anyone who says otherwise to reconsider.

I understand your point, but why create a hugely false dichotomy between a craft discipline and the science and engineering disciplines?


I strongly concur with points 2 and 6.

HarrietJones 2 days ago 1 reply      
Good article, though I don't agree with it all.

It is harder to grow software than it is to initially build it. Preconceptions bite you on the ass, data structures don't allow for new features, side effects multiply.

You don't need to learn the layers. In fact, if you're learning all the layers, you're probably an ineffective coder. This is not to say that you shouldn't investigate the layers or have a poke around the layers. But software's about reuse, and reuse is about reusing other people's work via known interfaces without worrying overmuch about what goes on underneath the hood.

I'm actually more of a debugger than a profiler, and as much as I'd like to believe that my way is as valid as his, I suspect that he's probably right on this and I'm probably wrong.

MindTwister 2 days ago 0 replies      
9. You count from 0
alainbryden 2 days ago 0 replies      
I like that you started numbering at 0 - very apropos.
plg 2 days ago 1 reply      
Is "learnt" a real word?
cryptide 2 days ago 2 replies      
>>a printf that's inserted that causes a program to stop crashing.


pjmlp 2 days ago 0 replies      
Quite interesting
The Silent Majority of Experts dadgum.com
224 points by pmarin  19 hours ago   95 comments top 18
gmurphy 12 hours ago 4 replies      
We call this the 'dark matter' problem - when you're trying to hire great people, for every well-known rockstar, there are tens of people who are better, toiling away in obscurity.

The early JavaScript scene was like this - there were people who published amazing JS work and everyone on DHTMLCentral knew who they were, and it was easy to find them and try to hire them. Yet we felt this huge frustration because occasionally we'd interview someone who was as good or better, but we had only found them by accident.

I'm excited about sites like Dribbble and GitHub, which make it easy to showcase your work without having to go through the pantomime of a large folio piece. Though it can be a pain for introverts to do, I've never met anyone who regretted overpublishing their stuff.

kamaal 7 hours ago 0 replies      
Even as I'm writing this I'm at a hacknight in Bangalore, an overnight event on Big Data. We are hacking away to learn and make a lot of things.

One thing I observed today was an old friend and colleague sitting next to me. He wrote a Perl program to analyze a week's worth of tweets in India and draw a really cool visualization out of it. The thing is, the script was not really big and was full of functional programming.

As I was reading the script with another friend, my friend said, 'Oh! I thought SQL was unpopular these days, and Perl was dead, and no one uses functional programming anymore.' I soon realized: for each of these cries, someone is still sitting and doing awesome stuff with the simplest, most basic tools.

The vast majority of awesome people don't blog or tweet or harp about every tiny thing they have achieved in cyberspace. But your everyday guy is vastly influenced by ranting, blogging and tweeting, and takes it as gospel. Anything other than that which produces results is generally a 'black swan' event to him.

So when something unpopular is used to get the job done, it's often difficult for the everyday guy to understand.

dkhenry 8 hours ago 0 replies      
If the point of this article is to say that you should not be spending time worrying about and evangelizing your tools, and more time using them to create awesome stuff, then I agree. However, if it's to say that there is a large crowd of exceptional programmers and engineers that no one hears about because they spend _too_ much time creating great projects, then this is baloney.

There are some people out there who are truly exceptional and might never be heard from. However, in general the real exceptional people are the ones pushing the envelope of what's possible forward. Their work _always_ finds the limelight. The author uses the example of a rigid-body physics implementation in a game, and while that's great work and the team should congratulate themselves, it's not that big of a deal; BLAS had been around for decades by that point. That does not describe experts doing amazing things. It describes software developers doing a good job and not being cocky about it.

I think the reason this has received such a warm welcome here is that so many of us software developers want to think we are part of that silent majority. That we are doing amazing things too, and if we cared to, we could write articles and become famous. The truth is we are not experts; we are people who have jobs to feed our families, and we may be just as skilled and talented as the vocal minority, but we would never be able to reproduce their success. That's not a bad thing. I have no problem being an unknown developer working on a piece of corporate software and providing a good living for my family. It's ok if others get press time for their contributions. I might take from them and learn, but it doesn't change the fact that they, at the end of the day, have moved our profession further and I haven't.

zerostar07 17 hours ago 3 replies      
Blogging also takes confidence: that you have something substantial to say and that what you say is valid. Unfortunately, due to the Dunning-Kruger effect[1], actually smart people have too much self-doubt, which would imply that not-so-smart people are overrepresented in the blogosphere, Google search results, and message boards. Couple that with the internet's popularity-contest dynamics and you have a larger problem than earlier publishing media had (which required a substantial level of peer review by experts).

Take home message: read everything on the net with a grain of salt.

[1] http://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

mjn 16 hours ago 0 replies      
Incentives also tend to disfavor "popular" work like blogging, but I think in parts of academia that is somewhat changing. It may be changing more slowly with regard to experts in industry, some of whom also have confidentiality issues to worry about.

As one example, law-professor blogs have really taken off. They're good for building reputation and making a legal argument actually have real-world impact (people read them, whereas the average law review article is little read), and sometimes have even been cited in court decisions! So there is a movement towards law professors doing more popularly oriented, but still legally sound, writing; and towards universities taking this into account.

Science has also long had some research/popular-writing crossover, and there are a number of researchers who blog in areas like physics and climate science. And mathematics has an increasingly strong blogging culture, with Terence Tao setting the nearly impossible to top example. So I think there is good stuff out there if you know where to look. Of course, it's also good to experiment on your own, and also to look elsewhere (e.g., in books).

16s 15 hours ago 6 replies      
I liked this sentence. It's so very true.

"Just because looking down your nose at C++ or Perl is the popular opinion doesn't mean that those languages aren't being used by very smart folks to build amazing, finely crafted software."

alan_cx 16 hours ago 0 replies      
There is a more general point here. There seems to be a growing assumption that activity on the internet is somehow proportional to activity in real life. Yes, this could be a problem for programmers, but IMHO it's becoming more and more of a problem with news gathering. Issues or causes with Facebook and Twitter accounts get far more coverage than those without. Basically these sites provide tittle-tattle to fill the 24hr news channels. OK, cool if you happen to be in a banana-republic dictatorship with internet access, but deep in Africa your tribe can be slaughtered and no one knows.
shrughes 15 hours ago 1 reply      
Implicitly in contrast is a loud minority of newbs, with their resume-building blogs that have an about-me page, a picture of themselves, and reposting/paraphrases of old Joel on Software posts.
yen223 17 hours ago 2 replies      
"An appealing theory that gets frantically upvoted may have well-understood but non-obvious drawbacks."

I'm reminded of the quote, "To every problem there is a solution that is simple, elegant, and wrong." I have fallen into the trap of looking at a long, messy piece of code and thinking "I can do better than this". I would replace 50 lines of code with 5, only to have it fail at some random edge case which the original had been fixed for.

That is why I always remain skeptical of people who are out to "disrupt" an industry, especially when they don't have much experience in that industry.

rbanffy 14 hours ago 0 replies      
I signed so many NDAs in the past 4 years it almost looks like I did nothing.
dsirijus 13 hours ago 1 reply      
One thing is being a good programmer, and wholly another is selling oneself as a good programmer. And that's absolutely not a sad state of affairs; it is how it is and it is how it should be.
nathancahill 10 hours ago 0 replies      
No doubt a lot of expertise is locked away in commercial software, but the open source community has equally (or more) impressive experts working on software. Maybe not blogging about it, but the expertise is definitely there.
n0on3 10 hours ago 0 replies      
The thing is: (blogging|writing)-about-/publishing- code takes time. And time is really what people doing amazing stuff are short on the most. Don't get me wrong, both discussing and publishing are amazing, but often life is just too short for the blah blah blah, unless you have good reasons for doing that.
OmegaHN 13 hours ago 0 replies      
I think this problem is why academia is so valuable; academics' sole purpose is to share their discoveries. What they build is not a product, but rather a set of ideas (theirs or others') to share with the public.
tubbo 11 hours ago 1 reply      
I think this has more to do with age than anything else. But it might also be because you can do stuff faster in Ruby or Python or JavaScript so you have all this time to wax poetic about it. ;)

There's still nothing like C/C++ for getting close to the metal. And until we have an OS written in Ruby or Python, rather than C, it's gonna be like that forever.

lani 15 hours ago 0 replies      
if only I'd read this article two - maybe five years ago ...
smeg 17 hours ago 1 reply      
People still building software in Perl ought to be arrested.
dkhenry 16 hours ago 3 replies      
Is it a silent majority? I don't think anyone will doubt that not every interesting thing that gets done gets published and discussed on the internet and on blogs, but I don't know if I would say it's a majority. From my experience there are some people in the software world who think they are either not noteworthy enough to write about what they're doing, or too noteworthy (I am not going to share my amazing super advantage I got by making this really smart and novel thing). I have found, however, that in general software development, like any technical trade, works best when it's subject to peer review and an open community. I think most people also know this, which is why the tech heavyweights have talked about their infrastructure and technique. They do it so others can build on what they have done and as a community we can build on each other.

There are a few who don't care to participate, thinking they are either above it or below it, but the truth is most people do, and everyone should.

John Resig: Secrets of the JavaScript Ninja Update ejohn.org
206 points by shill  3 days ago   64 comments top 11
javajosh 3 days ago  replies      
I really admire Resig, and this post contains a few gems:

1. Knew about PPK[0], didn't know about JZ[1]

2. Feature detection came about from writing this book! Which generalizes too: writing is a great way to innovate (something pg has said, too).

3. A nice little jab at GRRM, who thoroughly deserves it.

4. An exciting glimpse into the future of computer science pedagogy by way of JavaScript in the browser. (And I can't help but wonder if this is an oblique response to the recent Alan Kay Dr. Dobbs interview, where Kay asserts, "The Web, in comparison, is a joke. The Web was done by amateurs."[2])

5. Points out that even the best of us can get stuck in a procrastination loop.

6. Resig is going to finish a JavaScript book [3]

[0] http://www.quirksmode.org/

[1] http://perfectionkills.com/

[2] http://www.drdobbs.com/architecture-and-design/interview-wit...

[3] http://jsninja.com/

ck2 3 days ago 0 replies      
It's been so long I completely forgot I bought the book!

The best way to get a project done is to

   1. need the money
2. not get paid until you finish

If either of those elements are missing, expect delays ;-)

danso 3 days ago 3 replies      
I've written a few short-ebook-like technical posts and it still surprises me how many people ask when a PDF version is coming... I get that not everyone has access to a web browser on every occasion they have for reading... but the web provides a much better canvas for illustrating technical concepts, especially with the ability to create interactive bits with JS. Save the couch reading for leisure/non-technical reading, and read technical pieces when you're on the computer (and have ready access to Google/StackOverflow).

On a selfish note, writing HTML seems to be much easier than producing a properly formatted PDF page. So I enjoy technical writing, but not enough to go through the hoops of for-paper (or paper-like e-book) publishing.

r00fus 3 days ago 0 replies      
John is a really great guy and has done (at least for me) more to advance the knowledge of javascript than any other person I know.

Very fitting that he's now at Khan Academy.

I've personally bought the jsninja ebook for friends and myself because I thought it was a great read and wanted to contribute to this kind of effort.

ville 3 days ago 3 replies      
I haven't read the book, but this praise of with statements struck me as odd: "The portions of this book that cover features that are relatively un-changing, such as code evaluation, with statements, and timers are continually being used in interesting ways." There also seems to be a whole chapter in the book dedicated to with statements. What are these interesting and great applications that justify a feature considered harmful[0] and even removed from the language in ES5's strict mode?

[0] http://www.yuiblog.com/blog/2006/04/11/with-statement-consid...

kevinpet 3 days ago 0 replies      
I was looking forward to learning how to do a javascript ninja update, which I assumed might mean some kind of "reload the latest version of the JS while the user is on the page".
cdr 3 days ago 0 replies      
While it's disappointing that it's taken this long for the book to get finished, the publisher Manning has provided downloadable drafts throughout the whole process - a great feature. Thanks to that I've had access to the book (such as it's been) for pretty much the entire time, and that's what pushed me to buy it when I did.

Can't wait till my paper copy ships - guess I better make sure my address is accurate still :)

twfarland 3 days ago 0 replies      
Really respect that he chose to prioritise his personal life. Reminds me of the lesson I learnt from Herbert and Harry as a child. http://www.penguin.com.au/products/9780140567830/herbert-har...
anthonyb 3 days ago 0 replies      
Congratulations! Writing a book is insanely hard.
eswangren 3 days ago 0 replies      
Is a "JavaScript ninja" conceptually similar to a "my little pony samurai"?
cheap 3 days ago 1 reply      
Too bad I cancelled my order years ago...
Introducing Mozilla Persona, An Identity System for the Web mozilla.org
201 points by ojr  3 days ago   126 comments top 25
stickfigure 2 days ago 3 replies      
My company is one of the early adopters of Persona/BrowserID. You can see our dual-auth (with Facebook) system here:


We've been live for several months now in the Real World - our userbase (amateur athletes) is primarily nontechnical. About half of our users choose Persona/BrowserID and half choose Facebook. We were initially concerned about the BID login flow (in particular, the immediate email roundtrip) but it hasn't been a problem and the UX has been refined quite a lot over the last month or two.

For a mass-consumer audience, the combined FB/Persona solution is excellent:

* Facebook unquestionably has the slickest auth experience, even eliminating the followup name/sex/bday questions. However, a significant percentage of the world (possibly > 25%) either Hates Facebook or wants to keep their Facebook account isolated. This is unlikely to change in the near future and could even get worse depending on what sleeping dogs Zuckerberg decides to kick next week. We don't have the option of alienating the FB haters and we wouldn't want to anyways.

* The Persona UX is good and rapidly getting better. BigTent integration with gmail, yahoo, hotmail will bring one-click login to those users. A native experience is being built into browser chrome. All this is coming without me having to write code. It may not be as slick as Facebook, but I like where this train is headed.

* Integration is simple compared to writing a username/password system. The API is incredibly easy to work with. Dual-auth with Facebook is a little more complicated, but a complete Persona-based auth system is a question of hours, not days.

* The fact that identity is just an email address makes it easier to integrate with existing login systems. In our system, you can log into the same account with both Facebook and Persona as long as the emails match. No, email is not a perfect identifier, but even nontechnical users understand it immediately and really - what other option is there? "What email address did I use?" is a lot better than "What weird combination of letters and numbers did I use as a login name?"

* Support on the Mozilla dev-identity list has been fantastic.

We're pretty happy. Honestly, I don't ever see myself writing another username/password login system ever again. Persona is less work for a better UX.
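The server side of a Persona integration boils down to one call to Mozilla's remote verifier: POST the browser-supplied assertion plus your site's origin (the "audience") and check that the JSON reply says "okay". A hedged sketch; the verifier URL and response shape are from the BrowserID docs, but the two helper functions are illustrative, not part of any Persona library, and the actual HTTP request is elided:

```javascript
var VERIFIER_URL = 'https://verifier.login.persona.org/verify';

// Build the request your server would POST with any HTTP client.
function buildVerifyRequest(assertion, audience) {
  return {
    url: VERIFIER_URL,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ assertion: assertion, audience: audience })
  };
}

// The verifier answers with JSON like:
//   { "status": "okay", "email": "user@example.com", "audience": "...", ... }
// Only trust the email if status is "okay" and the audience is your own site.
function verifiedEmail(response, expectedAudience) {
  if (response.status === 'okay' && response.audience === expectedAudience) {
    return response.email;
  }
  return null;
}
```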

nnethercote 3 days ago 0 replies      
That link is to a very high level, non-technical overview, which is not a good link for HN.

Try these links, which go into much more detail, answering the "why bother" questions.




Note that BrowserID was the old working name for the project.

nostromo 2 days ago 3 replies      
One of the reasons FB and Twitter OAuth became so popular is because they solved a problem for the user (remembering passwords) and also gave the site owner a big carrot (social growth mechanics, more user data).

This seems much more one sided -- it's good for the user that doesn't use FB or Twitter but 'meh' for the website. I'm not sure we'll see fast adoption like we have for OAuth.

glassx 3 days ago 1 reply      
I believe that a native implementation has been on their plans from the start, but I haven't heard much about it lately. Here's some mockups:


There's also some work on integrating existing e-mail providers, so you can get instant identities:


This is probably the web-thing I'm most excited about today, even though the development is kinda slow. I really want to implement it as a sole identity provider on my websites some day.

EDIT: I also began writing a Safari extension that would work like the mockup above, but gave up halfway since it was too convoluted and totally insecure... maybe I should try writing a SIMBL plugin or something like that.

greggman 3 days ago 2 replies      
I know this probably fits only me but I never use my email address as id.

Every company and every account I sign up for gets a different email address. I'm not about to use a single one for my ID, especially when they are easy to spoof (anyone can send mail as you@there.com) and easy to spam and abuse. I had an email address that got 3000 spams a day. That's why I never use email as an ID: I want to be able to disable any email address that's giving me trouble.

So, not interested in BrowserID I think. Or maybe I didn't grok it.

drivebyacct2 3 days ago 1 reply      
I really, really like Persona. It's federated and it gives the identity provider a vast amount of control over the security protecting accounts.

Want to use SSH keys for auth? Okay. You can do that.

dendory 2 days ago 3 replies      
For most people, it goes like this: click the Facebook login button, done. Or: click the Persona button, have a new window open, enter your email, enter a password, go through setup, and get back to the site. This will be hard for most web users to adopt.

Flimm 2 days ago 2 replies      
Mozilla Persona (AKA BrowserID) is exciting stuff. However, it doesn't seem to have changed much since the last time it hit Hacker News [0].

There are websites that have chosen Persona as an authentication method. We now need to see the following two pieces implemented:

- It needs to be implemented in the browser's GUI, like in this old screenshot [1]. People will be able to see the usability and security benefits of having a standardised way to log in that's built-in into the browser. Could we at least have a Firefox extension or a nightly build of Firefox that does this?

- At least one email provider needs to be a Persona "ID provider", which would eliminate the need to create a Persona password and to click on a link in an email. My guess is that getting Gmail to support this would be slow work, why not try persuading smaller email providers like Fastmail to be the first to openly support Persona?

Fortunately, Persona does work even without these two pieces, using fall-back servers and a shim. But most of the benefits of Persona are only valid once the browser and the email provider deliberately support it.

[0] - https://news.ycombinator.com/item?id=2764824

[1] - http://i50.tinypic.com/2ptyv80.jpg

zaroth 2 days ago 1 reply      
I read a comment that the email provider can allow anyone to log into your accounts because they can sign any public key they want to say it's valid. This seems to be true from my 10 minutes reading the spec.

I know a password reset function that uses only email implies basically the same level of trust in the email provider, and I'm no fan of email-based password reset, but this feels even worse -- literally abdicating your security entirely into your email provider's hands. Gmail is great because it's free, but I didn't join Gmail with the idea of giving them the keys to my life.

Another thing I don't fully grok yet is the 'issued-by' concept. Does this mean that 'Relying Parties' need to whitelist all the secondaries they are willing to trust? How can that possibly fly?

Finally, in a native implementation, how is the keyring persisted on disk kept safe from malware extracting your private keys? If the browser can decrypt the keyring, so can malware.

Ok I lied, one more thing... Is there a password prompt when you first sit down at the native BrowserID implementation? Or does it just assume that it's you sitting there?! Then of course the next question is: how do you tell your browser you are walking away (akin to logout), and is it going to expire all sessions tied to your identity when that happens? So much to worry about...
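
The trust concern in the first paragraph can be made concrete with a toy model of the BrowserID certificate chain. This is a sketch, NOT the real protocol: Persona uses public-key signatures, and the relying party obtains the IdP's public key from its support document, whereas HMAC over shared string "keys" stands in here purely to show the chain's structure. Even the toy shows the point: nothing stops whoever holds the IdP signing key from certifying a key it minted itself.

```typescript
import { createHmac } from "crypto";

type Signed = { body: Record<string, string | number>; sig: string };

// Toy "signature": HMAC over a canonical JSON encoding of the body.
function sign(key: string, body: Record<string, string | number>): string {
  return createHmac("sha256", key)
    .update(JSON.stringify(body, Object.keys(body).sort()))
    .digest("hex");
}

const IDP_KEY = "email-provider-signing-key"; // held only by the IdP

// IdP step: certify that a user key speaks for an address (short-lived).
function issueCert(email: string, userKey: string, now: number): Signed {
  const body = { email, user_key: userKey, exp: now + 3600 };
  return { body, sig: sign(IDP_KEY, body) };
}

// Browser step: the user key signs the audience (the site) plus an expiry.
function makeAssertion(userKey: string, audience: string, now: number): Signed {
  const body = { aud: audience, exp: now + 120 };
  return { body, sig: sign(userKey, body) };
}

// Relying-party step: verify the certificate, then the assertion with the
// user key the certificate vouches for, then audience and expiries.
function rpVerify(cert: Signed, assertion: Signed, audience: string, now: number): boolean {
  const userKey = String(cert.body.user_key);
  return (
    cert.sig === sign(IDP_KEY, cert.body) &&
    Number(cert.body.exp) > now &&
    assertion.sig === sign(userKey, assertion.body) &&
    assertion.body.aud === audience &&
    Number(assertion.body.exp) > now
  );
}
```

The relying party never checks *which* user key the IdP certified, only that the IdP certified it, which is exactly why the IdP (or anyone who compromises it) can impersonate its users.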

rubberband 3 days ago 4 replies      
So we have this, Open ID, Facebook, Twitter, Google, Microsoft Passport/Live ID, and probably like five other major players I'm forgetting. I apologize for being pessimistic, but this just doesn't seem like a solvable problem. I want it to be solvable, but players like Facebook benefit far too much for them to not push their particular version of "universal" login on everyone. Every player seems to think they are the One True Solution... But so far, it's all one big mess. I don't see it getting better any time soon.
rangibaby 3 days ago 1 reply      
Random thoughts:

• The name "Persona" is odd considering something by the same name already exists in Mozilla-land, a fact they seem to be aware of.

• I hope sites use this instead of forcing Facebook login!!

jackalope 2 days ago 0 replies      
As an email provider for multiple domains, I have a hard time seeing what this actually has to do with current email infrastructure. It seems the only time the email address is used is to support a fallback notification, in which case many of the claimed benefits of BrowserID are lost (as you've fallen back to a single centralized identity provider).

Correct me if I'm wrong, but BrowserID/Persona assumes that for a user@example.org identifier there is a corresponding https://example.org. That's not true for most of the domains I support. In many cases, example.org doesn't even resolve to an IP address, and web servers are only run on subdomains (and not all subdomains have HTTP servers). Does BrowserID/Persona support a DNS mechanism to discover the location of the required HTTPS web server, similar to an MX record?

The opposite problem is where a user has a single email account with multiple valid addresses at multiple domains, such as user@example.org, user@foo.example.org, user@bar.example.org, etc. I work for an organization approaching a million users where this is the case, and it isn't going to change anytime soon. Once again, you can't assume there is an identity provider running on a web server at all of these domains. Is there a DNS-based method of discovery to solve this problem?

Does BrowserID/Persona allow users to authenticate against the same system they use when accessing email? If so, why does the OpenPhoto example ask for a password? I thought the whole point was that users avoid sharing credentials with sites. That implementation is very confusing and looks like a scary phishing attack.

Finally, it's not very clear what I should expect in the way of connections from other computers, whether I'm running an identity provider or not. This is crucial information to avoid tripping an intrusion detection system (IDS) during unexpected connections, especially from my own users.
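
On the discovery questions above: as far as I can tell from the spec, the answer is no. Discovery is a fixed well-known HTTPS path on the email address's domain, with no DNS (MX-style) indirection, and when the domain doesn't serve the document, the system falls back to Mozilla's centralized fallback IdP. A sketch of that rule (`servesDocument` is a stand-in for the actual HTTPS fetch):

```typescript
// Per the BrowserID spec, a domain advertises native Persona support by
// serving a support document at a fixed well-known path over HTTPS.
function supportDocumentUrl(domain: string): string {
  return `https://${domain}/.well-known/browserid`;
}

// Pick the identity provider for an address: the domain itself if it
// serves the support document, otherwise Mozilla's fallback IdP.
function idpFor(email: string, servesDocument: (d: string) => boolean): string {
  const domain = email.slice(email.lastIndexOf("@") + 1).toLowerCase();
  return servesDocument(domain) ? domain : "login.persona.org";
}
```

So a mail-only domain with no web server at its apex simply falls back to the centralized provider, which is the trade-off the parent is pointing at.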

qznc 2 days ago 1 reply      
I tried the OpenPhoto example. One thing that introduces friction compared to username-password: the first time, I have to create a Persona account. Unfortunately, I'm not logged in afterwards.

Most sites nowadays log you in right after account creation and just wait for email-verification later. Is that even possible with Persona?

wmf 3 days ago 2 replies      
Now if we can just get Chrome to agree on the same protocol.

SCdF 2 days ago 0 replies      
One thing that I don't get about BrowserID is how it's supposed to work on shared systems / web cafes. There doesn't seem to be any obvious way to log out. The best you can do is 'log me out after an hour', which is crap.
Lockyy 2 days ago 0 replies      
I've been sitting around thinking about implementing a user system on my current (and first) website to try and keep people returning to the site to see people responding to their stories etc.

Now I'm wondering whether to use Persona or build it myself from scratch.

hinting 3 days ago 7 replies      
Honest question: As an app builder, why would you use this instead of facebook?

Many more users are going to have Facebook logins already, and it provides social information that may be useful to your app.

(Hoping to hear answers other than the dev-centric 'I don't like facebook')

Of_Prometheus 2 days ago 2 replies      
Is there a list of all the sites, besides Open Photo, that have already enabled BrowserID?
znq 2 days ago 1 reply      
What about native mobile apps? Is it possible to easily use Browser ID for native iOS and Android apps? Or is it just too much of a hassle getting the information out of the web view?
sebastianmck 2 days ago 0 replies      
Was picking the name Persona such a good idea when it's so close to Personas, another Mozilla product?
allenap 2 days ago 0 replies      
First reaction: the bland beige looks terrible (maybe that's because I'm not American; I always notice that 1/2 of everything is beige when I'm in the US). Combine that with the name and I'm reminded of the Persona contraceptive device.
carlesfe 2 days ago 1 reply      
Okay, I have read the specs, the FAQ, the diagrams, and tried some examples, but I can't really understand how it works.

Could somebody do a quick summary for me? I would appreciate it very much.

andyfleming 2 days ago 1 reply      
five_star 2 days ago 0 replies      
Though this is something helpful, I find Mozilla a little slower than Google Chrome.
rtyrtyrtyrty 2 days ago 1 reply      


This bug with webfonts was reported a year ago and still occurs on Windows XP in Firefox 13.0.1; it first appeared in Firefox 4.

Kate Middleton's Wedding Gown Demonstrates Wikipedia's Woman Problem slate.com
183 points by soupboy  22 hours ago   165 comments top 20
kijin 21 hours ago  replies      
Every time a "problem" like this makes the news, the real problem always seems to be overzealous deletionists with their ridiculously strict notability requirement. Gender imbalance might be a problem, but it's not a problem to the same extent as notability-based deletionism is. Notability is an extremely vague standard, a perfect recipe for abuse and selective enforcement. A fair and efficient editorial process should strive to replace vague rules with clearer counterparts whenever possible.

Honestly, I cannot think of a good reason to delete any article at all, unless it's obviously fraudulent, marketing-oriented, illegal, or obscene according to a widely accepted definition of obscenity. All of these standards can be applied fairly strictly, and with much less vagueness than notability.

- It's not like Wikipedia is short of disk space to store a few million extra text articles.

- The argument that it would be too difficult to maintain lots of extra articles is also weak, because not every article needs to be regularly edited, and more articles on niche topics might actually attract more editors.

- No, we won't end up with a page for every John Doe and his cat. That's just alarmism. Besides, if something like that ever becomes a problem, a better response would be a prohibition on self-promotion or some other clear guideline, rather than a vague requirement of notability.

- If these deletionists are just being OCD and wanting everything to be tidy and clean and under their editorial control, I would say that they need to take a break. In fact, it's possible that people with certain psychological traits self-select for Wikipedia editorship. But the kind of intolerance and self-centered narrow-mindedness that overzealous deletionists exhibit doesn't suit the spirit of a collaborative online project. Keep your OCD to your own home/office and away from public spaces, thank you very much.

Right now, I get the impression that it's too easy to flag something for deletion and too difficult to counter the deletionist argument, especially since the deletionists are so familiar with editorial procedures. This inequality needs to change. The burden of proof should be on people who want to remove information from the Web, not on those who want to keep it. Isn't that the same principle that we fought tooth and nail to uphold against the onslaught of SOPA, ACTA, etc?

tokenadult 8 hours ago 1 reply      
I'm a Wikipedian, with a registered Wikipedia user account and moderate Wikipedia editing experience since 2010. The interesting discussion thread groups together two kinds of issues: issues discussed in the submitted article from Slate and issues brought up by Hacker News participants. I'll discuss each in turn.

The Slate article by Torie Bosch,


a professional journalist who edits a project covering technology and society issues, reports from this year's Wikimania meeting that Wikipedia continues to face criticism from readers who think its group of editors ("Wikipedians") skews too heavily toward "geeks", resulting in underrepresentation of topics of interest to women. Thus far Wikipedia is still working on plans to encourage more women to become Wikipedians and to edit more regularly.

She finishes up by writing, "I've never been a Wikipedia editor. The community struck me as uninviting, legalistic." I'll be interested in her experiences if she decides to wade in. Unlike most Wikipedians, Torie Bosch has actual professional editing experience, having had to submit manuscripts to editors who chop out her darling words, and having had to chop out words from the manuscripts of other reporters. Most Wikipedians have not had professional editing or research experience of any kind before joining Wikipedia, and what I find most "uninviting" about Wikipedia is not that it is "legalistic" (although it often is legalistic) but that many Wikipedians are completely clueless about what a good source looks like and how bad many of the current articles have been for how long. I'm not sure yet if Wikipedia is pursuing a successful strategy to improve content quality.


After being very involved in Wikipedia editing just as there was a major Arbitration Committee case on topics that I have researched thoroughly for years,


I have reduced my involvement mostly to "wikignome" editing of random mistakes I encounter as I use Wikipedia as a reader. I still have the SOFIXIT mentality,


of cleaning up problems in Wikipedia as I find them, but to fix big problems on Wikipedia caused by point-of-view-pushing propagandists is even more work than editing a publication as an occupation (something I have done), and yet unpaid. So I really wonder how much time Torie Bosch will devote to Wikipedia when she could be doing editorial work in an actual collegial environment at Slate with pay and professional recognition.

The Hacker News comments before this comment have mostly referred to the issue of "deletionism." For example,

Every time a "problem" like this makes the news, the real problem always seems to be overzealous deletionists with their ridiculously strict notability requirement. . . .

Honestly, I cannot think of a good reason to delete any article at all, unless it's obviously fraudulent, marketing-oriented, illegal, or obscene according to a widely accepted definition of obscenity.

I wonder if there is an organized campaign to fix the overzealous deletion problem (by changing the "notability" policy), to boycott as long as it remains and pledge to donate if it is changed to a more objective policy.

Why are any articles deleted, unless they are factually wrong? Censorship. Who is to say what will be important in the future? Censorship. Who is to say that people will want to read? Censorship.

I have noticed a lot of information/articles on Wikipedia get deleted/flagged for deletion at a rather zealous rate, and in that I have one question: WHY? If they are not superseded or made redundant, then personally I feel they should never be removed.

The one-word reply to comments like these is "Deletionpedia."


I was just browsing random pages of Deletionpedia to see what was posted there before the Deletionpedia project fizzled out (which appears to have been back in 2009). These are by no means the worst examples of material that has been deleted from Wikipedia (I'm not sure if Deletionpedia was ever an exhaustive list of deleted articles, or only a selected sample of those), but the sheer lack of maintenance of Deletionpedia over the last few years calls baloney on the idea that there are lots of readers happy to read stuff that has been deleted from Wikipedia. As bad as Wikipedia often is, EDITING (modifying and deleting) stuff on it so that Wikipedia more closely resembles an encyclopedia makes some Wikipedia pages much better reads than many of the millions of pages that would turn up in a keyword search on the same topics.

I don't believe that a lot of readers see value in an online "encyclopedia" with a no-deletion or hardly-any-deletion policy because no one has put up the money to fund one, and I'm not aware of anyone here on Hacker News who is donating programming skill to start one. If you really think articles "should never be removed," build a service to host articles written by anyone about anything and see what happens.

The big problem on Wikipedia is not deletionism. It is insertion of promotional articles (some more subtle than others), propaganda articles (likewise), personal or family vanity articles (very numerous), and fan and hobby articles that are not based on any reliable sources and are written in a manner more suitable for MySpace than for any encyclopedia.

A lot of people who attempt to edit Wikipedia never look up the article about what Wikipedia is not,


and attempt to publish their own thoughts, promote their own causes or businesses, social network in an online encyclopedia, self-report the news, or otherwise post material that has nothing to do with maintaining a free online encyclopedia built from reliable sources.

haberman 21 hours ago 4 replies      
I am frequently surprised to find Wikipedia articles about very narrow, highly-specialized technical concepts. For example, there is a Wikipedia page about iso646.h in the C language standard (http://en.wikipedia.org/wiki/Iso646.h). If you're wondering why you never heard of iso646.h before, it's because it is only useful if you are programming in C on a system that does not support cutting-edge characters like "&" or "|". This is esoterica that does not have any measurable impact on the world. And this isn't even an article about ISO 646 per se, it's about a header file.

I find such articles extremely useful, so I'm not advocating that they be deleted. But surely the wedding gown worn by a British royal who is widely known for her fashion sense is at least as relevant to the world at large.

w1ntermute 21 hours ago  replies      
This brings up a very interesting point - can anyone think of topics that Wikipedia has inadvertently omitted thanks to the narrow section of society that contributes significant amounts of new content to the site?

There's been a lot of talk in recent years about how the "initial work" of adding information to Wikipedia is mostly done, and that from here on out, it's going to be mainly about adding new content as it's created (new events, people, companies, etc.). But it seems possible that myopia on the part of editors could be having inadvertent effects.

konstruktor 17 hours ago 2 replies      
Articles like this make me angry. Wikipedia is one of the most important, if not the most important, examples of digital commons, created by an enormous amount of volunteer work. 91% of those doing this work are male, according to the article. Now a group that has contributed only a small fraction of the work for a decade complains about being underrepresented? This is not the military, which had actual barriers of entry for women; this is Wikipedia, which you could edit without an account only a few years ago. At that time it was a true, anonymous meritocracy, as opposed to a mailing list where your name gives away your gender and may subject you to gender discrimination.
conradfr 19 hours ago 1 reply      
Some years ago I cleaned up a random biology article vandalized with lots of not-even-subtle profanity. It was immediately reverted by some editor, and I had to insist that he read my modification to get it approved.

It was the last time I tried to edit anything in WP, as I always had this kind of problem.

I read it a lot like anyone else and donate a little money every year, but each time I read about what goes on behind the scenes, I'm appalled.

aw3c2 21 hours ago 5 replies      
Isn't it sexism to say that linux distros are for men and wedding gowns for women?
alan_cx 15 hours ago 2 replies      
Why are any articles deleted, unless they are factually wrong? Censorship.
Who is to say what will be important in the future? Censorship.
Who is to say that people will want to read? Censorship.

There should not be any notion of importance. All knowledge is important. What I find important is as valid as what any one else values as important.

Frankly, and to my shame this is the first time I have given any thought to it, I am disgusted that something which, IMHO, is supposed to be an unbiased information repository actually deletes knowledge. To me, this is the most disturbing case of censorship I have ever thought about. Government censorship is expected; bad news, sure, but expected. But this is supposed to be above that. How can they bleat on about SOPA etc., then allow a small number of geeks to tell me I can't see an article about some princess's dress? Wikipedia is NOT Geekpedia. And it should not be censoring knowledge.

Quite sad actually. My Wikipedia love bubble just burst. :(

lien 21 hours ago 2 replies      
Wikipedia is an interesting place. I am not sure what qualifies articles for deletion, but it feels like only people in the Internet industry take part in actively running and editing the articles. I don't think of it as a male-dominated community (even though I am female); it feels to me like a community only for people who care about the stuff they already know. It feels more like a closed-minded community than one with a woman problem.

My previous company was a chip and Wi-Fi module startup (ZeroG Wireless). I requested a page for our company around 2009. At that point, we had been around for 4 years and had taken $30m in funding. However, we were never granted a page on Wikipedia.

On the other hand, plenty of Internet companies that had been around for much less time than that have their names on Wikipedia, for example, Pownce. I am sure many others were granted a Wikipedia entry after being around for less time, and accomplishing less, than we did. The only difference is that those were Internet startups and ZeroG was not.

Some people simply need to wake up and stop living in their own bubble. Let's hope that one day they realize that others do care about things that the community doesn't care about.

nsns 21 hours ago 1 reply      
Having an article marked for deletion means nothing; it happens all the time, and it's one of the ways power flows in Wikipedia's anarchic domain (and I'm not sure this kind of power play is really gendered; it is more of a conservative, restraining force).

While I highly regard Wikipedia's amazing and quite successful project, and hope there will be more editors that are female (oriented, not necessarily biological), there is still a lot that's not there, and perhaps never will be, yet matters for various cultures and localities around the world. Wikipedia English has a built-in bias (hint: English), and it's not a gendered one.

Zenst 21 hours ago 0 replies      
I have noticed a lot of information/articles on Wikipedia get deleted/flagged for deletion at a rather zealous rate, and in that I have one question: WHY? If they are not superseded or made redundant, then personally I feel they should never be removed. This may be some form of gender bias, though I have not looked up male sports items on the wiki. The dress does have a nice backstory from what I read in the news and was a pretty big historical event as far as weddings go, and in that regard it will not, factually, be superseded.

So the real question for me is: is Wikipedia a source of knowledge/history/facts, or is it biased towards flavour of the month (FOTM)? If it is the latter, then perhaps they need to re-evaluate their priorities.

As to a solution, maybe they could require all articles flagged for deletion to be approved by one male and one female. Though in that I will say that some males have a female mindset and some females have a male mindset, and in that they should be able to express and vote based upon their mindset as opposed to their physical gender.

There is no golden solution, though I do feel the zealous removal of articles should be curtailed and an approach of only removing non-factual/incorrect articles be taken; in that, there will be fewer issues and fewer references to items that have been removed.

I hope they reach an amicable approach, or I fear they will only spawn a womenpedia-style site for women only, and that would be a sad day and a true wakeup call to the insanity of the directions being taken. Look at business for examples of how isolation impacts things: you will see many females who state they promote women in business, and yet you never see a man saying he promotes males in business. One is accepted and the other is sexist. But sadly they merely add fuel to the issue; instead of addressing it, I personally feel they exacerbate it. Though if you were or felt persecuted, you too would stand up and do something about it, if you were a strong person. Not all people are strong in defending their morals and fairness, and in that it does highlight that women are just as right to feel persecuted. Though I do wish the approach of "supporting fairness in business" were adopted as opposed to "supporting women in business", as it is just that: about fairness, and that can and does work both ways on many levels.

Much respect to Mr Wales for spotting this issue and taking onboard, a true sign of a fair person.

I also have no interest in make-up and wedding dresses and the like, but I fully respect that they are facts of the world and in that have as much right to be there as any Linux distro. I'm not forced to look up those articles, nor am I forced to read about Linux distros on the site, but having that option is something I completely and utterly support; to do otherwise would be unfair, and that is something I feel uncomfortable with. Hopefully this clear and documented bias can be eliminated in any form it takes in life, be it race, sex, orientation or origins. We are all human, and we strive to be better with every generation. Humanity is wonderful when it works and abhorrent when it fails. Let's stand up and count everybody.

mkramlich 18 hours ago 1 reply      
I see arguments for and against allowing Middleton's dress to be covered. For: it meets the criteria of being a widely covered and notable event, documented by numerous major media entities, and associated with a historical event of the ruling royal party of England. Against: yes it's just ephemeral celebrity gossip chit-chat that gives certain people (most likely, mostly women or perhaps some gay men) a feeling of warm fuzzies when thinking about it or fantasizing about it. So yes that feels wrong for Wikipedia because there are plenty of other websites and mediums for that sort of thing.

In other words, the problem is that it both belongs, and doesn't belong. And they need to resolve that paradox, maybe setting a new precedent or revising their official criteria.

I think the "not enough women" thing is just a side issue. And one that has an easy and blatantly obvious solution: if you're a woman and you want to become a Wikipedia contributor or moderator, then go do it. If enough of you do it, then the gender balance will shift notably. If enough of you are not interested, then it won't. There's nothing inherently wrong with either state of affairs, it would be just the way it is. For example, I don't think it's "wrong" that the overwhelming majority (99.8%+) of hair cut folks at Great Clips over the years, in my direct experience, have been women, because that probably just reflects the natural level of interest of men and women in working in that role. I don't feel oppressed or excluded. If I wanted to work there cutting hair, or have a man cut my hair, I'd make it happen, end of story, and if not, or either way, I'd live with it and move on.

zerostar07 11 hours ago 0 replies      
I think the problem with the Wedding Gown has more to do with the fact that there is no UK wikipedia, rather than a gender issue. The English wikipedia is an interesting amalgam with lots of peculiarities when seen from an outsider's perspective.
pnathan 12 hours ago 0 replies      
You can get an iPhone app subscription to Britannica for a few $/mo. I have found Britannica to be excruciatingly better written and more comprehensive in its in-depth articles than Wikipedia. Things that aren't in the nerd enclave are covered, and well. I would never donate to Wikipedia as it is today: I happily send money to Britannica.

There are too many structural problems with Wikipedia - documented over the last few years by various angry bloggers - for me to feel OK with Wikipedia. Some of the content - good. The community & rules - blech. c2 is a better wiki. :-)

nsns 17 hours ago 0 replies      
Wikipedia's stated aim, to make information "freely" available, has no gender bias; it doesn't ask for it before serving its content - in marked contrast to other services (Google? Facebook? certainly the various ad services) which might structure their content according to a user's gender.

However, if Wikipedia has another aim - to make the scope of its content bias-free, then I think it has not thought it thoroughly yet: even structuring information as encyclopedia entries is inherently biased and restrictive (not necessarily bad though). Correlating Wikipedia's contributors' sex to an assumed gender bias in its scope (who gets to decide the articles' 'gender'?), as Wales does, proves how naïve such a project currently is.

Codhisattva 9 hours ago 0 replies      
Along the same lines as the dress issue is this piece from On The Media http://www.onthemedia.org/2012/mar/09/professor-versus-wikip.... It goes into depth about the editorial problem at Wikipedia.
sbmassey 16 hours ago 1 reply      
Why not simply accept that wikipedia has a particular bias, and set up your own wiki - fashionpedia perhaps - if you want to create articles that wikipedia doesn't accept?

Wikipedia has become a success because of its culture. It should be very careful about changing that based on the demands of the entitled multitudes.

vacri 18 hours ago 1 reply      
Because all women love dresses, just like all geeks love linux.
naveen99 14 hours ago 1 reply      
Isn't deletionism irrelevant in a post-GitHub world? Anyone can just create a gist instead. I hope GitHub forks Wikipedia.

Facebook Didn't Kill Digg, Reddit Did forbes.com
182 points by boh  1 day ago   115 comments top 41
cletus 1 day ago 7 replies      
None of these things killed Digg. Digg was simply a transitional meme.

Prior to the Internet you had mass media controlling distribution. The Internet comes along and you have things like Usenet and the proto-Web.

Then comes the first crowdsourcing sites to allow people to find content without employing people to curate that information. Slashdot was certainly early in this trend.

What Digg allowed is a certain band of people to control the information flow. People would get paid to promote submissions as it became clear that a front-page submission generated a lot of pageviews.

But what became apparent with all these community sites (and this includes forums) is they start with an early group who provide value to each other. This group ends up becoming insular. De facto standards form. But even in the Usenet days you had the "September" problem (where new college freshmen would get Internet access and not understand the "rules" and conventions that were in place and would ask questions that had already been answered, etc).

Basically all these social sites get worse over time as the masses flood in.

Digg died because the idea that there is a central source for news was a holdover idea from the old media days. Reddit understood this. Global reddit is basically useless. The subreddits are the only remotely interesting thing about reddit.

People complain about how HN is getting worse. That's probably true and it is true (and will continue to be true) of any such social site in the future.

I've heard the same complaints about Twitter.

Facebook for most people is not a source of news. It doesn't have the same link-sharing mindshare (IMHO) for most people that other mediums have. Ultimately I think the biggest use case for Facebook is still sharing photos. People go to Facebook to find out what their friends are doing. Very few go to find out what's going on in the world (much as Facebook would like that to be the case).

I'm surprised at how some wax lyrical on how amazing reddit is. It's just a minor tweak on a long trend of existing prior art (the subreddits). Personally I think it's a cesspool full of trolls. Proggit (programming.reddit.com) is (IMHO) just awful.

jgrahamc 1 day ago 1 reply      
And that happened, IMHO, because the people behind reddit didn't behave like jerks and the people behind Digg did.

One example of being jerks: http://blog.jgc.org/2006/07/sense-of-humor-failure-at-digg.h...

After that I was unbanned, but not before an employee of Digg defamed me in a blog post by making false claims: http://blog.jgc.org/2006/07/unbanned-from-digg.html

And here's an example of how reddit people weren't jerks when I inadvertently brought the site great slowness: http://blog.jgc.org/2010/09/tale-of-two-cultures.html

benihana 1 day ago 9 replies      
Digg did not kill Digg. Nor did Reddit, nor Facebook. People using Digg killed it. The same people are now killing Reddit, making it little more than a place to regurgitate memes, nerd pop culture, and rage comics. Reddit and Digg both used to be places to go for interesting discussion based off decent links. Over time, good and insightful discussion gave way to quick-to-consume, funny one-liners. You can see it happening in popular subreddits. Recently the moderators of /r/science started taking a heavy hand and deleting irrelevant comments.

This isn't a problem with the sites, it's a problem with the users of these sites. This is democracy in action. It's cable news. People voting don't want to see challenging, thought-provoking content. They want to see things that confirm their biases, or things they can repost to facebook for quick laughs from their friends.

programminggeek 1 day ago 2 replies      
The value in Digg was the tribe it created, not the software, not the ads, not the human beings keeping the lights on. It was the tribe of people who found interesting news on the internet and shared it.

FB, Reddit, HN did not "kill" Digg. The tribe growing up and/or moving on to other things ended Digg. A similar tribe is at Reddit now, but a similar tribe used to live at Slashdot. Before that they probably lived on Usenet message boards or wherever.

In college, the comp sci program I was in had a private message board that ended up having a very similar vibe to digg/reddit/slashdot. Tech heavy, at times very heated political and religious debate. Eventually the original group graduated and it's never been quite the same.

It seems that each generation has something like digg, slashdot, reddit, whatever... that is "the thing" for hanging out and sharing/complaining about the news of the day or whatever is interesting. They might look like fads because on the internet they peak and crumble pretty quick, but really it is probably a natural cycle that communities and tribes go through.

Eventually HN and Reddit will become irrelevant to certain groups and the tribes that live there will move on to the next thing, whatever that might be.

I'm guessing the next site like this already exists or is going to be built soon, so any guesses as to what it will be if it's already out there?

rickmb 1 day ago 1 reply      
Nobody killed Digg. Digg committed suicide by telling its original, loyal user base to go fuck themselves in its quest for more money and a broader audience. Digg became too greedy.

But it was pretty obvious much earlier on that Digg had zero respect for its users. In many ways, Digg had a very old school broadcast attitude: the users were merely part of the product, only the advertisers mattered.

kn0thing 1 day ago 0 replies      
Here's the blog post TechCrunch flamed me about (called me a liar and whatnot). I wrote it after I tried the alpha of diggv4, but before it was released to the public; I have no idea what happened internally, but the resulting product was indeed pretty devastating -- the final self-inflicted wound after years of encouraging power users and sacrificing the best interests of the userbase: http://alexisohanian.com/an-open-letter-to-kevin-rose
rprasad 1 day ago 4 replies      
Reddit did not kill Digg. Digg killed Digg. Reddit's big break was Digg v4, when most of the community finally jumped ship.
yaix 21 hours ago 0 replies      
Reddit Didn't Kill Digg, Digg Did

I read a bit of the discussion on Reddit, and there were surprisingly (to me) many people who had used Digg before. Then something called "v4" came and Digg became unusable to many people. As I understood the discussion, Digg didn't care. So its users looked for alternatives and moved to Reddit. Digg still didn't care. People got used to Reddit and stayed. Digg still didn't care.

Yesterday I looked at Digg.com for the first time in years. Even at a quick look I spotted a number of beginner's mistakes in the front-page design alone.

For example the clickable headlines to the stories: they are the main content. Yet they are very /very/ light and hard to read on the white background. WTF. You want your main content to have /good contrast/ and stand out, and secondary stuff (like "points" or "who submitted" or "vote buttons") to have less contrast so as not to distract from your main content. Yet, even such basic things Digg gets wrong. I didn't dare to digg further.

grandalf 1 day ago 0 replies      
Actually Digg killed Digg with the redesign. One time after the redesign I went to the site and there was NO CONTENT anywhere. It wanted me to make friends with people before I would get to see any content.

I logged in. Still no content. That was the last time I visited digg.com, and I'm sure I'm not alone.

My guess is that if the acquirers just revert to the pre-redesign version, Digg will come back to life.

officemonkey 1 day ago 0 replies      
The tipping point, IMHO, was when Digg banned people from typing the following number: "09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0."

As soon as that happened, Digg lost the alpha nerds. The rest was destiny played out over time.

butterfi 1 day ago 3 replies      
I was struck by the author's comment "Well, they haven't redesigned since what appears to be 1997, which has pleased their user base who like the simple look."

How long before web designers start complaining about reddit the way they complain about craigslist's design?

ajays 1 day ago 0 replies      
IMHO, Digg killed Digg.

In any such ecosystem where the inmates are running the asylum, pissing them off is not a good course of action.

The tribe knew that it was the reason why Digg was what Digg was. But arrogance on the part of the upper management at Digg was its downfall; they couldn't come to terms with the basic fact that the people were responsible for Digg's success. So they decided to tweak it, "enhance" it, modify it to "Digg 2.0", and the people revolted.

Lesson: if you are a user-driven site, listen to them. Don't piss them off.

r00fus 1 day ago 0 replies      
My take - Digg killed itself. Kevin Rose either sold out to advertisers or had a tin ear for his site's community.
binarycrusader 1 day ago 0 replies      
No, sorry. As someone who used Digg on a daily basis: Digg killed Digg.

The day that Digg changed their interface was the day they lost a huge portion of their users, including me. I went back once more after that, and then never again.

debacle 1 day ago 0 replies      
I would place almost all of the blame of Digg dying on Digg itself.
ojbyrne 1 day ago 0 replies      
In my opinion (and I was there), the never ending side projects killed Digg. Revision3 (especially), Pownce, Wefollow, etc, etc.
hybrid11 1 day ago 0 replies      
I used to be a huge Digg fan, and checked it religiously. Digg v4 is what completely drove me away.

The Digg v4 idea was good, but poorly executed. They shouldn't have suddenly forced you to follow other people to get your news. They were trying too hard to become a social network like Twitter / Facebook. Instead, I think they should have integrated with Twitter / Facebook to find top news, rather than starting their own social network.

cliveholloway 21 hours ago 0 replies      
DIGG killed Digg. The week after V 3.0 launched was a mess. Tons of corporate sponsored posts (AKA intrusive adverts), often multiple submissions by the same person on the front page at the same time, and other weirdness.

Reddit was right place, right time to pick up the exodus. I don't think it did anything to kill Digg.

WestCoastJustin 1 day ago 0 replies      
Personally, I feel Digg killed Digg. They were their own worst enemy and drove people to alternatives.
toddhd 1 day ago 0 replies      
As an ex-Digger, I can tell you that it was Digg that killed Digg for me. At one point in time, Digg was cool. Granted, it was still mostly a rehash of news from Reddit, 4Chan and other sites, but the audience base was large enough to provide original content as well, and the UI was considerably better than any of the other sites.

What killed it (for me anyway) was that Digg suddenly allowed advertisers to start posting away. Ads popped up everywhere, and every other post came directly from Mashable. Digg was no longer cool; mostly, it was Mashable's alternative site. :)

I switched to Reddit. Reddit didn't like all the Digg users migrating over initially, but attitudes have cooled over time. I can't really see a move that Digg could make at this point that would entice me back.

crisnoble 1 day ago 0 replies      
I am surprised no one has linked to ncomment's rendition of how the Digg vs Reddit war went down.

Part 1: http://ncomment.com/blog/2009/04/08/war-13/

Part 2: http://ncomment.com/blog/2009/12/17/war-23/

Part 3: http://ncomment.com/blog/2012/01/06/war-33/

tibbon 1 day ago 0 replies      
One thing I never understood about Digg is why at one point they had around 300 employees and an incredible burn rate, only to do less successfully and less efficiently what Reddit was able to do with 5-10 employees.
unreal37 1 day ago 0 replies      
I think businesses face significant hurdles (and are likely to die) when the founders lose interest in running them. Digg. Twitter (Dorsey left but came back and saved it). Yahoo. Flickr. MySpace. The list is endless.

Even Microsoft is not the same since Bill Gates handed the reins to Ballmer. They're just too big (Office/Windows cash cows) to actually die from that.

Founder disinterest is lethal.

zmmmmm 23 hours ago 0 replies      
Reddit is a bit like the craigslist of news aggregators. They've resisted the lure of monetizing in any big way and continue resolutely to put their users first. Digg got overambitious and sold out and their users smelled it and left.
gubatron 16 hours ago 0 replies      
no, Digg killed digg.
They didn't listen to the community and they lost user data and did nothing to recover it; at least that was my experience after losing several thousand diggs on my user account and then being reset to 0. You can't imagine the piss off.
marcamillion 1 day ago 0 replies      
Facebook, Twitter, Reddit didn't kill Digg. Digg killed Digg.

Kevin got caught in a situation where he had to please investors; they were looking at what other people were doing, and he stopped innovating and started copying.

That's what killed Digg. Nothing else.

Kaedon 1 day ago 1 reply      
I left Digg for Reddit because Reddit's community at the time valued articles with content over images. As Reddit has absorbed Digg's old user base, so too did it absorb much of the culture of Digg. Reddit no longer seems to have much focus on reading and has instead become a place to find funny images, much like Digg in its heyday. It feels inevitable that the culture of a website like Reddit will change over its life span but it bothered me to see it happen first-hand.
mirceagoia 1 day ago 0 replies      
Digg was killed NOT by Facebook OR Reddit...but by its power users. Those killed Digg.
paulhauggis 1 day ago 0 replies      
I loved digg. When it was first out, I was able to get roughly 70,000 people to my website per week by just posting an article and having my friends digg it.
myakimov 1 day ago 0 replies      
I was just thinking about this yesterday when I read the Digg article.
I used to read Digg all the time, until it got to the point where the comments in each Digg submitted article stated how that article/news story was on reddit first. This is when I found out about reddit... haha
wastedbrains 1 day ago 0 replies      
I would say Digg's attempts at manipulating the front page, banning people, blocking posts, and unfairly pushing some content to the front helped kill it. When I lost faith in the democracy of Digg I just stopped going and spent my time and votes elsewhere (reddit).
antithesis 14 hours ago 0 replies      
> What did Reddit do right that Digg did wrong? Well, they haven't redesigned since what appears to be 1997, which has pleased their user base who like the simple look.

Reddit was founded in 1997?

dan_yall 1 day ago 0 replies      
Maybe I'm the only one, but I moved from digg to reddit entirely because digg was blocked at work and reddit wasn't. I've worked at four different places in the last five years and at each one I expect to see reddit blocked as well, but no one has gotten around to it. Maybe my situation is atypical, but I have to believe that the lack of availability of digg to office-bound slackers had to have played some part in its demise.
Vivtek 1 day ago 0 replies      
I'm pretty sure it's Yahoo! that pulled the trigger.
conductr 1 day ago 0 replies      
As I remember it, Digg forced some updates on its users that they (we) did not like and were very vocal about. Facebook does the same thing, except there were other options for Digg users (reddit); Facebook users have a steep cost of moving to another service. I would actually argue Facebook users have no other options.
asdfasdghasdf 17 hours ago 0 replies      
This article will be useful for the zero people who think Facebook killed Digg.
velodrome 14 hours ago 0 replies      
Digg v4 killed Digg. Reddit filled the void. End of story.
krsunny 19 hours ago 0 replies      
This is news?
natmaster 1 day ago 0 replies      
Digg actually killed themselves. Reddit was just there to reap the rewards.
vijayanands 1 day ago 0 replies      
Actually, IMO Hacker News killed Digg. All the news here would have been in Digg otherwise :)
Toshio 1 day ago 2 replies      
I also used to read Digg all the time, but over time it became a cesspool of pro-microsoft trolling, and I left in disgust. Imagine being attacked every time you expressed legitimate disappointment with vista. I expect we are going to witness something similar this fall on reddit (and even here) when that new polished turd gets released.
Vim, you complete me thoughtbot.com
179 points by Croaky  2 days ago   112 comments top 18
Rudism 2 days ago 7 replies      
It all started out innocently enough. You experimented with it once or twice in your first year of college, but Nano and Pico were easier--closer to what you had already been using during high school on the Windows machines and Macs. But as time went on and you got more experience under your belt in the college-level computer science courses, you started to notice something: All of the really great programmers--the kind who churned out 4 line solutions for an assignment that took you 10 pages of code to complete; the kind who produced ridiculously over-featured class projects in a day while you struggled with just the basics for weeks--none of them used Nano or Pico.

Staying late one night to finish an assignment that was due at midnight, you happened to catch a glimpse over one of the quiet uber-programmer's shoulders. Your eyes twinkled from the glow of rows upon rows of monitors in the darkened computer lab as you witnessed in awe the impossible patterns of code and text manipulation that flashed across the screen.

"How did you do that?" you asked, incredulous.

The pithy, monosyllabic answer uttered in response changed your life forever: "Vim."

At first you were frustrated a lot, and far less productive. Your browser history was essentially a full index to the online Vim documentation; your Nano and Pico-using friends thought you were insane; your Emacs-using friends begged you to change your mind; you paid actual money for a laminated copy of a Vim cheat sheet for easy reference. Even after weeks of training, you still kept reaching for your mouse out of habit, then stopped with the realization that you'd have to hit the web yet again to learn the proper way to perform some mundane task that you never even had to think about before.

But as time went on, you struggled less and less. You aren't sure when it happened, but Vim stopped being a hindrance. Instead, it became something greater than you had anticipated. It wasn't a mere text editor with keyboard shortcuts anymore--it had become an extension of your body. Nay, an extension of your very essence as a programmer.

Editing source code alone now seemed an insufficient usage of Vim. You installed it on all of your machines at home and used it to write everything from emails to English papers. You installed a portable version along with a fine-tuned personalized .vimrc file onto a flash drive so that you could have Vim with you everywhere you went, keeping you company, comforting you, making you feel like you had a little piece of home in your pocket no matter where you were.

Vim entered every part of your online life. Unhappy with the meager offerings of ViewSourceWith, you quickly graduated to Vimperator, and then again to Pentadactyl. You used to just surf the web. Now you are the web. When you decided to write an iPhone application, the first thing you did was change XCode's default editor to MacVim. When you got a job working with .NET code, you immediately purchased a copy of ViEmu for Visual Studio (not satisfied with the offerings of its free cousin, VsVim).

Late one night, as you slaved away over your keyboard at your cubicle, working diligently to complete a project that was due the next morning, you laughed to yourself because you knew no ordinary programmer could complete the task at hand before the deadline. You recorded macros, you moved entire blocks of code with the flick of a finger, you filled dozens of registers, and you rewrote and refactored entire components without even glancing at your mouse. That's when you noticed the reflection in your monitor. A wide-eyed coworker looking over your shoulder. You paused briefly, to let him know that you were aware of his presence.

"How did you do that?" he asked, his voice filled with awe.

You smile, and prepare to utter the single word that changed your life. The word that, should your colleague choose to pursue it, will lead him down the same rabbit hole to a universe filled with infinite combinations of infinite possibilities to produce a form of hyper-efficiency previously attainable only in his wildest of dreams. He reminds you of yourself, standing in that darkened computer lab all those years ago, and you feel a tinge of excitement for him as you form the word.



cletus 2 days ago 6 replies      
Vim reminds me of an example from Bret Victor's excellent talk Inventing on Principle [1] (which you should watch from beginning to end if you've somehow been hiding under a rock and have missed it).

Bret mentions Larry Tesler (starting at about 38:10) who made it his personal mission to eliminate modes from software.

This is the problem I've always had with Vim and I suspect I'm not alone in this. I find the concept of modes jarring, even antiquated. Everyone who has used vi(m) has pasted text in while in command mode and done who knows what.

Emacs is better in this regard but I find Emacs's need for consecutive key presses (or a key press followed by a command) to be longwinded.

The advantage of either is they're easy to use over ssh+screen (or tmux) for resuming sessions.

That all being said, give me a functional IDE any day. IDEs understand the language syntax. Vim can do a reasonable job of this. Emacs (with elisp) seems to do a better job (or so it appears; I'm no expert) but IDEs (my personal favourite being IntelliJ) just make everything easier. Things as simple as left-clicking on a method and going to its definition.

For statically typed languages (eg Java/C#), IDEs (quite rightly) rule supreme. Static code analysis, auto-completion, etc are just so much better than text-based editors.

Dynamic languages are more of a mixed bag and it's certainly the norm for, say, Python and Ruby programmers to use one of these.

Still, give me an IDE any day.

One objection seems to be that people don't like using the mouse. I tend to think the speed differences over not using the mouse are largely illusory.

Anyway, I can use vim but I've never felt comfortable in it and I don't think I ever will and modes are the primary reason.

EDIT: I realize there are plugins and workarounds for many of these things but that's kinda the point: I don't want to spend hours/days/weeks/years messing with my config to get it "just right".

Also, instead of archaic commands to, say, find matching opening/closing braces (which tend to involve an imperfect understanding of the language), IntelliJ just highlights matching braces/parentheses, where a variable is used and so on.

[1]: http://www.youtube.com/watch?v=PUv66718DII

solutionyogi 2 days ago 3 replies      
I have been using Vim for over 3 years and I didn't know about the whole line completion.

I think the best thing about Vim is that it will always keep surprising you. If only we could find a life partner like that, there won't be any divorces.

calinet6 2 days ago  replies      
I understand Vim. Really, I do. I even know how to use it for the most part.

But I guess I just don't get it. It's too obtuse. I don't feel connected to my editing while using it, I feel... connected to Vim. Which, I think, might explain why others feel so connected to Vim too. They get attached to it because it's an investment, and as we know from various psych studies, we get attached to things we invest in.

This isn't a bad thing, just wanted to throw my 2¢ out there. Vim's a cool language for text editing, but it's not the only one.

pdeuchler 2 days ago 7 replies      
This may just be me, but I feel that a lot of the productivity that Vim supposedly creates is a result of programmers spending lots of time learning one tool very well, as opposed to something inherent within Vim that automatically makes you more productive.

That's not to say Vim doesn't have an excellent ecosystem, and has tried and true ergonomic benefits, however I feel that if someone spent the same amount of time learning, say TextMate (just an example, may not be the best), they would be just as productive.

pbiggar 1 day ago 0 replies      
I used vim for about 10 years before I finally gave up, and it was completion that did it for me. It was just so ludicrously hard to customize vim (writing vimscript - although the python integration wasn't much better) that it hurt. I was held back by not being able to customize it, and the fact that I couldn't get any better completion than ctags was insane.

My startup Circle is written in Clojure, so I was pretty much forced to learn Emacs, and the world is such a nicer place. I really wish I had learned Emacs 5 years ago - with the time I spent mastering and then trying to customize vim, I could have mastered and learned to properly customize emacs.

exDM69 1 day ago 0 replies      
Nice overview of vim's completion features, but...

This article is quite incomplete; there are various completion modes it doesn't cover. I also don't recommend mapping completion to tab (or even using SuperTab or other completion plugins). Vim has a total of 13 different completion modes (see :help ins-completion), all of which are useful: completion on local and global keywords, on whole lines, on filenames, and more.

In particular, this article calls Vim 7's omni complete (^X^O) "syntax aware complete", which it is not. It's language-specific completion, which can complete things like members of structs or functions. You need the relevant language-specific plugins installed. Vim ships with a decent plugin for C and C-like languages, which works if you have a ctags database generated (:help 'tags').

If you want syntax completion, you can map it to user-defined completion (^X^U) like this:
  set completefunc=syntaxcomplete#Complete

It's not particularly useful as such, because syntax complete completes keywords like "for" and "while" which we all have in muscle memory.
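For concreteness, a minimal .vimrc sketch of the setup described above (the `ctags -R .` invocation is an assumed workflow, not something from the comment):

```vim
" Enable the filetype plugins that provide language-specific omni completion (^X^O).
filetype plugin on

" Search for a ctags database upward from the current file's directory;
" assumes one was generated with `ctags -R .` in the project root.
set tags=./tags;,tags

" Map user-defined completion (^X^U) to the built-in syntax completer.
set completefunc=syntaxcomplete#Complete
```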

johnchristopher 1 day ago 0 replies      
Using the article's completion method, it turns out that I can no longer use <tab> to insert a <tab>, since it displays a list of words to be completed.

It might be due to my vim configuration but I doubt it (though I am not a vim script expert by any means).

Here is a function I found on the web a long time ago and that seems to prevent that problem:

  inoremap <Tab> <C-R>=MyTabOrComplete()<CR>

  function MyTabOrComplete()
    let col = col('.')-1
    if !col || getline('.')[col-1] !~ '\k'
      return "\<tab>"
    else
      return "\<C-N>"
    endif
  endfunction
No idea how it really works, I never investigated.

Credits/source: http://www.slideshare.net/andreizm/vim-for-php-programmers-p... slide 44

edit: formatting and credits

JangoSteve 2 days ago 1 reply      
> For example, in HTML vim should know that you can't nest a div inside an a...

In HTML5, you can. See http://html5doctor.com/block-level-links-in-html-5/

jes5199 2 days ago 1 reply      
There's something weird about the ^P/^N menu - I always have trouble telling which possible completion is selected in the list, and I often end up picking the wrong one.
riannucci 2 days ago 0 replies      
Also, SuperTab is all of the win: https://github.com/ervandew/supertab/

It'll let you use <Tab> for all the completion types, and you can tell it the fallback order for various things (i.e. try omni, then <C-N>, etc.). It's even slightly context aware, so it can guess a good first completion method for you.

All with <Tab> :)
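For reference, a minimal configuration along those lines (a sketch assuming the SuperTab plugin is installed; check `:help supertab` for the authoritative option names):

```vim
" Let SuperTab guess the completion type from context (omni, filename, keyword).
let g:SuperTabDefaultCompletionType = "context"

" Fall back to plain keyword completion (<C-N>) when no context is detected.
let g:SuperTabContextDefaultCompletionType = "<c-n>"
```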

baggers 1 day ago 0 replies      
Though this doesn't help with the holy wars, I do find it just fantastic that there are two editors which, given an acceptable amount of playing, so many people find to be the near-perfect tool for their craft.
I played with vim first but never quite got a flow going...oddly enough I hit it off almost immediately with emacs, it was still damn confusing but I made much more progress much faster. I'm almost disappointed I didn't fall for Vim as it is always on any server I have to use and emacs muscle memory has only compounded my problem with Vim shortcuts!
For those not fancying diving into vim or emacs: are there any more traditional IDEs with a way to script new behaviours and have them feel native?
wiradikusuma 1 day ago 1 reply      
I'm a Java IDE user (Eclipse, then IntelliJ).

In IntelliJ, you can ctrl+w to select a word, ctrl+w again to select the statement, ctrl+w again to select the function, etc.

You can also refactor/replace all, be shown whether foo.gif actually exists in an img src, ctrl+click through to CSS, be shown when a variable (in code/js/css) is unused, and be hinted when a line can be simplified, etc.

Can vim do that?

If it cannot, what's the compelling reason for people like me to use it? (I'm honestly interested, but wondering if it's worth the effort.)

sirdavidoff 2 days ago 0 replies      
Moving to vim from TextMate, I found that I missed TextMate's autocompletion.

In Vim you have to specify whether the word you're looking for is above or below the cursor (using C-P or C-N). IIRC, TextMate just looks for the closest matching word, no matter where it is.

I wrote a little plugin to make it do exactly that:


Disclaimer: totally rough and not (yet) very customisable
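The behaviour described (nearest matching word wins) can be sketched in a few lines; this is an illustration of the idea, not the plugin itself, and `buffer_words` is a hypothetical list of (position, word) pairs harvested from the buffer:

```python
def nearest_completions(buffer_words, cursor, prefix):
    """Return words matching prefix, ordered closest-to-cursor first."""
    matches = [(abs(pos - cursor), word)
               for pos, word in buffer_words
               if word.startswith(prefix) and word != prefix]
    return [word for _, word in sorted(matches)]

# Example: with the cursor at offset 60, the nearest match comes first.
candidates = nearest_completions(
    [(10, "foobar"), (100, "foobaz"), (58, "food")], 60, "foo")
# candidates == ["food", "foobaz", "foobar"]
```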

mun2mun 2 days ago 0 replies      
Also check out neocomplcache, https://github.com/Shougo/neocomplcache
ChuckMcM 2 days ago 0 replies      
We never saw poetry written about TECO.
gubatron 2 days ago 0 replies      
"Fuck VIM" -Richard Stallman
va_coder 2 days ago 0 replies      
I didn't know this one: ^X ^L
We (unexpectedly) got 60K users in 60 hours - What we learned patrickambron.me
177 points by jordanmessina  2 days ago   111 comments top 33
klbarry 2 days ago 3 replies      
Interesting! I made my own guide a while back for my friends, with great success.

1a) Make a "brand" with your middle name
Google your first and last name. If you're like most people on earth, you're one of many with your particular combination. So how can you rank higher?

Never fight a battle you don't have to. Pick a middle name, real or imaginary. Google your new full name.

Example: My name is Kevin Barry. The Google result is completely owned by Wikipedia and other impossible to compete against sites.

My full name is Kevin William Lord Barry. I think Lord sounds cool, so I'll make Kevin Lord Barry my “official” online name. It's much easier to rank for and even helps with personal branding.

1b) Consistency!
Put your new name on top of your resume for consistency.

2) Edit/Create Your Facebook
Take your new name. If your Facebook looks professional, change your Facebook name to your new name. If not, make sure your Facebook doesn't use your new full name.

3) Edit/Create Your LinkedIn
Take five minutes to create a LinkedIn account with your new name. Put all of your resume information on it neatly. LinkedIn will rank well for your new name, and you can brag as much as you want on it without looking pompous.

4) Make Yourself Look Good on Amazon
Make an account on Amazon, using your new branded name. Pick a couple of books in your industry with good ratings. Read the summaries (read the book, preferably, but I won't judge if you don't). Leave a review of the books that makes you look good: show that you know industry terms, talk about your experience, etc.

Each review you leave will go to your Google front page and make you look smarter. This only works if you know enough about your industry to sound smart, of course. You can also do this for textbooks, or fiction that you like if you want to sound interesting.

5) Make Accounts on Web 2.0 Websites
Take five minutes to make an account on sites that allow descriptive profiles with your full name: Quora, Yahoo Answers, DisQus, Meetup, or anywhere else you want. Feel free to participate in these communities to help even more, although it's not necessary.

6) Strut Your Stuff!
Here's where you can have fun and really seem impressive. Go to Weebly.com and make a free website, called “yourfullname.weebly.com”. Set the page title to “Your Full Name Online” and the page description to “Your Full Name's Online Website”. Write a paragraph about yourself on one page, and a page with links to your linkedin, Facebook, or anywhere else you want to show people. Go nuts and add anything else you want that might make you seem interesting.

engtech 2 days ago 2 replies      
Have you any thoughts of handling the edge case of users with ridiculously common names?

In the top ten for my firstname lastname I have:

  - a famous chef
  - a life coach who is doing branding SEO
  - someone else who is a programmer
  - a professional photographer

In the top three for firstname middlename lastname:

  - various high school kids
  - a registered sex offender

patrickambron 2 days ago 1 reply      
Hey everyone, just wanted to say I'm truly flattered my blog got posted to HN and made the front page (I'm the author). I hope people found it helpful.


1) Some of you wanted to know how HN compares to other sources for us, in terms of signups, conversions, etc. I'll post it in this thread at the end of the day. (Would you guys want a follow-up post on this, "Value of HN visits"?)

2) I've gotten some great feedback on the actual product from the HN crowd, which is awesome. Feel free to leave any feedback on this thread (you can try it at http://brandyourself.com)

eps 2 days ago 2 replies      
Patrick, can I ask if you get any actual calls in response to the "call and talk to the real person" offer on your website? I'm sure many on HN recognize the importance of having a phone number on the site, but it'd be very helpful to know what it translates to in the real world.
pdx 2 days ago 1 reply      
How are they determining what companies are googling you?

Do they attempt to create a top ranked page about you, and then monitor IP addresses visiting it, and match that to some db that maps company to IP address?
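The IP-to-company lookup pdx speculates about is roughly how such features tend to work: match the visitor's address against a table of known corporate ranges, or fall back to reverse DNS. A minimal Python sketch, with an entirely hypothetical range table (the real service's data and method are unknown):

```python
import socket

# Hypothetical range-to-company table an analytics service might maintain;
# the real service's data source is unknown.
KNOWN_RANGES = {"66.249": "Google", "69.171": "Facebook"}

def guess_org(ip, known_ranges=KNOWN_RANGES):
    """Guess the organization behind a visitor IP address.

    First match against a static table of known corporate ranges,
    then fall back to reverse DNS, whose hostname often names the
    ISP or company that owns the address.
    """
    prefix = ".".join(ip.split(".")[:2])  # crude /16 match for the demo
    if prefix in known_ranges:
        return known_ranges[prefix]
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        return hostname
    except OSError:
        return "unknown"

print(guess_org("66.249.64.1"))  # -> Google
```

Commercial offerings usually license a much larger IP-to-organization database rather than relying on reverse DNS alone.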

aresant 2 days ago 1 reply      
I love this service, was in the early signup group and within a month controlled my first 10 results.

Keep expecting a pivot to small biz though and hope that's in the plans, seems like a much easier group to monetize.

shootthemoon 2 days ago 3 replies      
There is one problem with the "find who is googling you" feature. For about a year now, google has started hiding the search results from the referral when someone uses google while logged in to google services.

You may have seen "not provided" in google analytics for searches that found your site... This is users logged in to google. And with more people using google services such as gmail and g+, more and more results will be hidden. On my corporate tech web site, I'm seeing 40-50% of results from google as "not provided" nowadays, and it's increasing every month.
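The 40-50% figure is just the gap between total Google-referred visits and those that arrived with a reported keyword. A quick sanity check of that arithmetic, with made-up numbers:

```python
def hidden_share(total_google_visits, visits_with_keyword):
    """Fraction of Google organic visits whose search terms were
    withheld (shown as "not provided" in Google Analytics)."""
    return (total_google_visits - visits_with_keyword) / total_google_visits

# Made-up numbers: 10,000 Google visits, 5,500 arriving with a keyword
print(f"{hidden_share(10_000, 5_500):.0%}")  # -> 45%
```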

joshuahedlund 2 days ago 1 reply      
Very cool, looks like it has a lot of potential. I'm hoping you have some spam prevention strategies in place, though, as it seems pretty ripe for it as it takes off (just like any free service that lets you create profiles and post links to sites, but especially a service that promotes SEO and posts followed links to sites)
gamingfiend 2 days ago 1 reply      

> Co-Founder & CEO at BrandYourself.com

> January 2009 - January 2000

Branding fail

Jordanian 2 days ago 1 reply      
"As any startup will tell you, most publications will only feature a startup if given some sort of exclusive. In their eyes, the only reason to cover a startup is if it provides readers with something they can't get anywhere else."

How do you start this conversation with a publication? How hard is it to simply cold email Mashable and attempt to get an article?

I'm assuming once the relationship is built it's easy to get follow-up posts, but do you have advice for making that initial connection?

GrantCov 2 days ago 1 reply      
Very interesting product. I haven't yet had a chance to fully explore the site, but I will certainly be creating an account in the future.

I have two concerns:

1) What happens when two people with the same name sign up? Couldn't that lead to issues where they're both trying to promote their own stuff while burying the others?

2) Is there any way that someone could pose as someone they're not in order to sabotage that person's Google results? I could also see a friend doing that in order to pull a prank on you.

As I said, I haven't fully explored the site, so if these concerns are addressed there, apologies. Great article!

ricardobeat 2 days ago 1 reply      
While this break-down is very interesting, it all boils down to one thing: they announced an innovative feature that interests a lot of people. Things could have played out differently but the outcome would be the same - lots of signups.

If someone from the company is reading: I found out I already have an account (probably from the time it launched), but I can't access it. Login doesn't work and when I try to reset my password it says "e-mail not found".

eli 2 days ago 1 reply      
More just FYI, but it's weird that your blog doesn't show up in https://www.google.com/search?q=We+(unexpectedly)+Got+60K+Us...
jordanmessina 2 days ago 1 reply      
Any chance you will release a plugin for a non-brandyourself site? I'd love to put the "find out who's googling you" feature on my blog or even my startups website.
spindritf 2 days ago 1 reply      
How much better are your results in boosting visibility compared to, for example, getting myname.com and linking to it from all your social media profiles?
nickler 2 days ago 1 reply      
I remember telling people about this when it first launched, citing it as a vanity metrics booster.

As a startup founder, however, it's invaluable. I'm a converted fan, and the process even taught this rookie a bunch about link building and SEO.

Great work, keep it up, and loved the blog post.

kevinholesh 2 days ago 1 reply      
Were these high quality users in the sense that they converted to paid from a free account? What source led to the highest quality users?
69_years_and 2 days ago 1 reply      
Just a thought - some people go to a lot of effort to 'hide' who they are by ensuring their name does not turn up - or if it does, it's so mixed in with many results that it would be hard from a casual review to determine this is the John or Jane Doe one is after - while others put a lot of effort into making sure a search turns up the correct Messrs Doe.
patrickambron 2 days ago 0 replies      
Thanks again everyone, I'm truly flattered. We had some great discussions (have some catch-up to do today, had no idea I'd be spending so much time on HN :)).

Since so many people were curious to see the effect HN would have on our signups, I think I'm going to do a follow-up post next week.

Zenst 2 days ago 1 reply      
Good things happen to good products. Though Hashable is closing, so you might see an influx of users migrating from there over the next few weeks.
quanfucius 2 days ago 1 reply      
I really like how you guys built a viral mechanism into the product. You incentivize the user to share their BrandYourself with their existing profiles/networks to receive a "boost", which helps spread word of your product. Pretty ingenious if you ask me. Good job guys!
whit537 2 days ago 1 reply      
Good work, man. Takeaway for me is to make the most of Google Analytics' conversion tracking features.
newobj 2 days ago 0 replies      
Congratulations, but there's no way to say who a "user" is in 60 hours. What you got was 60k signups in 60 hours.
exim 2 days ago 0 replies      
How did you implement that particular feature? (Notify when someone searches you)
dropshopsa 1 day ago 0 replies      
I want to know how many signups you get from this submission to HN
DanielOcean 2 days ago 1 reply      
Awesome piece. It's rare that a startup dives this deep into the details. You deserve the 1000+ signups you'll score from a HackerNews front page ;-)
3amOpsGuy 2 days ago 1 reply      
I feel like I want these guys to succeed just because Patrick seems such a nice guy, have enjoyed reading the threads.
dave_chenell 2 days ago 1 reply      
30% is an awesome conversion rate. Any specifics on how you were able to get it up so high?
ga4 2 days ago 1 reply      
I really like your product, but the only issue I had was when I was boosting my links. After every step in the boosting process I was asked to share that step with my Social Network, while that may help my SEO it became more of a nuisance. I would like to be asked only after completing all of the boosting steps.

Other than that keep up the great work!

rozap 2 days ago 1 reply      
You might want to take the quote from Fox news off your front page. First thing I noticed...
eragnew 2 days ago 0 replies      
Thanks for sharing the lessons
brandcoachkelly 2 days ago 1 reply      
Hey Patrick, I don't see the link to the article. Did I miss something? Can you please resend, I am very curious. Thanks!
davedx 2 days ago 1 reply      
"Error establishing a database connection"

Is this a joke?

Use OSX Finder Quicklook (Spacebar) to preview all plain text files coderwall.com
177 points by rover  3 days ago   58 comments top 20
aes256 3 days ago 1 reply      
Thanks so much for this. For the longest time I've found it irritating that I couldn't preview plaintext .nfo files with Quicklook.

I didn't realise Quicklook plugins were a thing...

miles 3 days ago 1 reply      
Suspicious Package http://www.mothersruin.com/software/SuspiciousPackage/ is another neat QuickLook plugin that displays detailed information about Installer Packages (.pkg files).
Zirro 3 days ago 3 replies      
I'll take the opportunity to ask: Is there a way to make Spotlight index files in the Library folders? There are sometimes some obscure preference or cache files that I need to find.
pstadler 3 days ago 3 replies      
Somehow off topic but this is a very neat trick:
You can preview the selected item directly in the Spotlight menu by pressing the left arrow key.

Source: http://www.macstories.net/mac/all-you-need-to-know-about-qui...

pstadler 3 days ago 1 reply      
Nice plugin!

Pro tip: Instead of restarting Finder, run `qlmanage -r`

rasmus_b 3 days ago 1 reply      
vito 3 days ago 2 replies      
Is there something like this for Chrome? Really annoying having code downloaded instead of displayed.
mtr 3 days ago 2 replies      
On a related note, is there any way to get Spotlight to index all text files?
zwass 2 days ago 0 replies      
File this under things that should have worked in the first place.
matthijs 3 days ago 2 replies      
Would it be possible to make the text inside the Quicklook window selectable?
I often just want to quickly copy something from a file, and Quicklook would be the fastest way to do so.
hcarvalhoalves 3 days ago 1 reply      
As always... simple hacks, the best hacks!
SmileyKeith 3 days ago 0 replies      
This is fantastic. Always wondered why Quick Look refused to preview plain text files just because they had a different extension. Especially Markdown.
backwardm 3 days ago 0 replies      
You can also add a QuickLook directory to your own ~/Library if you are into keeping stuff separated.
uladzislau 3 days ago 0 replies      
Is there any way to change the font or its size in Quicklook?
lorrin 3 days ago 3 replies      
Anyone know how to control what the Open with button offers? E.g. it's offering "Open with Sublime Text 2" and I'd rather have an Open with MacVim.
redog 3 days ago 1 reply      
Is there such thing as plain text?
magoon 3 days ago 1 reply      
This is so good
miggaiowski 3 days ago 0 replies      
.org files =)
205guy 3 days ago 0 replies      
So now opening a text file is "cutting edge?" [1]

[1] http://news.ycombinator.com/item?id=4226539

PS, just like clicking "More" on HN front page after reading a few articles results in a "cutting edge" error.

Petition the U.S. Government to Force the TSA to Follow the Law schneier.com
175 points by thoughtsimple  3 days ago   35 comments top 12
user49598 3 days ago 2 replies      
We all realize that there is no direct measurable effect on government from signing these petitions. And we also realize that signing these petitions and then taking no further action does not make you a hero.

But for god's sake, stop posting about how they're useless every time someone starts talking about one. First off, we get it, some people think they're painfully useless. Second off, just because you can't see any direct effect, or just because the effect wasn't exactly what you wanted it to be, doesn't make them useless. They are a good tool for rallying support behind an idea. They are a good tool for spreading awareness. They are a good tool for getting a cause a little bit of notice. They are a good tool for collecting thoughts in a coherent manner so they may be further discussed.

TLDR: We get it. You don't get petitions. Please figure it out or stop complaining. You're not helping a goddamn thing.

warfangle 3 days ago 5 replies      
I predict it will end up just like every other "successful" petition on whitehouse.gov:

"We hear you, but you're wrong and we aren't going to change a damn thing."

blhack 3 days ago 0 replies      
None of this matters. Look at the response we got to "Legalize and regulate marijuana in a manner similar to alcohol"


nhebb 3 days ago 1 reply      
Let's face it, the only petition that will make a difference is the big one on November 6th. I'm amazed the TSA's policies and practices have not become a campaign issue. If you care about this, then petition the candidates to take a stance for civil liberties.
true_religion 3 days ago 0 replies      
Well if you have standing, take them back into court (you know, the place where the government is actually willing to listen to you). Petitions are meaningless.
sp332 3 days ago 1 reply      
Doesn't the court itself have some power to enforce its mandates?
chx 3 days ago 1 reply      
If you want change with the TSA, first figure out what a politician can say when (s)he is attacked for being soft on terrorists, because that's what's going to happen if anyone tries to reform the TSA.
Bruce_Adams 3 days ago 3 replies      
Attempting to log in to https://petitions.whitehouse.gov/ is very slow and eventually gives an error message:

Our Apologies....

The site is currently undergoing maintenance. We appreciate your patience while we make some improvements.

Please check back soon.

kfinley 3 days ago 2 replies      
Looks like https://petitions.whitehouse.gov is using MongoDB

  Additional uncaught exception thrown while handling exception.

MongoCursorException: couldn't send command in Mongo->__construct() (line 35 of /mnt/codebase/petition-release-2012-07-11/sites/all/modules/contrib/mongodb/mongodb.module).

MongoCursorTimeoutException: cursor timed out (timeout: 30000, time left: 0:0, status: 0) in MongoCollection->findOne() (line 22 of /mnt/codebase/petition-release-2012-07-11/sites/all/modules/contrib/mongodb/mongodb_cache/mongodb_cache.inc).

Edit: The page is loading correctly now.
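The cursor timeouts in that trace are a classic symptom of a site buckling under load; the usual client-side mitigation is to retry transient failures with exponential backoff. A generic sketch, using the stdlib TimeoutError as a stand-in (a real fix for the pymongo errors above would catch the driver's own timeout exception):

```python
import random
import time

def with_retries(op, attempts=3, base_delay=0.1, retry_on=(TimeoutError,)):
    """Run op(), retrying transient failures with exponential backoff.

    Uses the stdlib TimeoutError as a stand-in; the real driver raises
    its own timeout exceptions, which would go in retry_on instead.
    """
    for attempt in range(attempts):
        try:
            return op()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of attempts, surface the error
            # exponential backoff with jitter to avoid a thundering herd
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

Of course, retries only smooth over brief spikes; a sustained overload like the one above needs more capacity, not more patience.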

shashashasha 3 days ago 1 reply      
Are any of you seeing a lot of duplicate Signature #'s? http://o7.no/NhQtfu
sneak 3 days ago 0 replies      
...because the fact that they simply ignore the law presently means that a petition will make them willfully start following it.

Delusional. It's time to leave America.

J3L2404 3 days ago 0 replies      
Are there petitions for poverty and healthcare somewhere?
Report: Facebook Monitors Your Chats for Criminal Activity mashable.com
157 points by adventureful  2 days ago   157 comments top 29
DanielBMarkham 2 days ago 4 replies      
It's been clear in my mind for some time now that Facebook is desperately doing anything possible to stay plugged into our internet lives. Their attempted take-over of email, which will probably lead to some success, only reinforces this. I think they see the writing on the wall -- that newer services will take over older ones -- and are doing anything they can to stay top dog.

What we need is an abstraction layer on top of social networks. No matter what their TOS, they do not own my friends or my conversations with my friends. I have no qualms at all about having some other service handle my friendships and conversations in a way I deem appropriate.

We need to pry Facebook's greasy hands from our throats before it's too late. At one point they were cute. Then they were pleasantly time-wasting. Now they're crossing over the line firmly into evil territory.

rationalbeats 2 days ago  replies      
I'm not a criminal, I am a pretty mundane guy actually, but of course we live in a society where every single one of us breaks some small law every day.

Which is why I stopped using Facebook.

I also stopped using Twitter to tweet. I still use it to follow news sources, I just don't actively tweet. I did that after the NYPD won a court case to see all the private messages you send on Twitter.

I also don't comment much at all on blogs and social sites like this one or Reddit anymore. (I used to be a top 10 contributor over at Reddit. At least that is what some metric said a few years ago when someone listed the top ten most popular usernames. That account is deleted now)

I am slowly pulling out. I have a deep distrust of the current surveillance state in the United States. I remember reading a story about a guy who posted a quote from Fight Club on his Facebook status, and a few hours later, in the middle of the night, the NYPD was busting in his door; he spent 3 years in legal limbo over it. (Might have been NJ police anyways, red flags)

You start piecing together these things, and you start to realize that your thoughts and ruminations about life, the universe, and the mundane, can be used against you at any moment and can completely strip you of your liberty and freedom, and any happiness you may have had.

I am gonna be completely honest, I am scared to express myself any longer on the Internet in any fashion. I don't trust it any longer. I don't trust the police, I don't trust the FBI, I don't trust the federal government, and I also don't trust, nor have faith, in the justice system in the United States.

sriramk 2 days ago 5 replies      
This actually tripped up a friend of mine a couple of years ago. She left a comment on a photo of someone holding a toy gun saying "You look like <insert-name-of-well-known-terrorist>" followed by a smiley. Within hours, she got a message and a phone call from someone claiming to be working for FB's security who asked her some basic questions on why she left that comment. The whole experience scared her from using FB for a long time.

I thought the whole thing was ad hoc and confusing. Anyone who saw the comment could easily see that it was a joke. Also, if it wasn't a joke, why is FB calling her and not someone from law enforcement?

Would love it if someone from FB here on HN could comment.

stfu 2 days ago 3 replies      
What I find interesting is that the "but think of the innocent children" argument is now also getting adopted by the corporate world to justify incredible privacy invasions.

Facebook's mass wiretapping and analysis of its users' private communication seems almost like the post office scanning each and every letter and postcard in the vague hope of finding some keywords related to bomb, terror, and of course "children". I wonder how long it is going to take until Google starts sending automated notifications to my local police station when I start googling some water bomb tutorials for the summer.

zethraeus 2 days ago 1 reply      
The Mashable article seems to be sourced from a Reuters article.
http://www.reuters.com/article/2012/07/12/us-usa-internet-pr... The program does appear to focus on sexual predators.

Mashable quotes Facebook as stating “where appropriate and to the extent required by law to ensure the safety of the people who use Facebook"

Can anyone speak to whether or not proactive scanning could possibly be required by law? It seems entirely unlikely, but IANAL.

malandrew 2 days ago 4 replies      
I'm of the opinion that once enough people get fed up with a surveillance state, or even a surveillance society since private entities are involved, that the best way to "fix" the problem is by collectively generating noise that makes it too expensive and time consuming to find a needle in a haystack. Right now they probably generate very few false positives, however if many people went out of their way to actively generate false positives on a regular basis, you've effectively disabled such a system and manufactured reasonable doubt.

Generating deliberate false-positive inducing noise in communications deemed to be private between two or more individuals who know one another should be protected as free speech. To argue otherwise would be the equivalent of prosecuting an individual for yelling "Fire" in their own home among friends and stating that such an act is a clear and present danger to the US.

IMHO automated cooperative manufactured reasonable doubt will probably be one of the last bastions of civil liberties in a surveillance society.

crazygringo 1 day ago 0 replies      
I find this fascinating from a legal/political perspective.

Facebook is essentially using the same techniques to monitor private communications as the NSA supposedly does. This means Facebook has the power to report, for example, selected messages but not others. (I'm not saying they do, of course, just that they could be selective or discriminatory that way.)

The fact is that Facebook has taken upon itself a role similar to that of the police, but without any democratic oversight.

This is different from a bar owner overhearing a conversation about a crime and calling the police, because he wasn't specifically monitoring every single word said by every bar patron. But Facebook is casting a wide net by analyzing every conversation that happens.

Questions: should Facebook be permitted to do this? Should we ask for laws preventing companies from "eavesdropping" on their users' communications with the intent of detecting and reporting criminal behavior? Should this be the role of the democratically-elected government instead? Should sites be required to turn user communication over to the government for such analysis?

It's a fascinating area of law/politics with so much room for future development, and gets down to the heart of what values a society has.

olliesaunders 2 days ago 3 replies      
Does anyone know a site where all of the scary things (civil rights and privacy violations) that are going on have been aggregated? I sometimes get people asking why I'm not on Facebook. It would be nice to have a place to point people to about why because it's quite difficult to explain normally.
ck2 2 days ago 0 replies      
Somewhat related, apparently if you want to shutdown someone's paypal account and suspend their funds (and yours as well, be warned) just send them some money with the reason "drug money".

Apparently people have sent their friends money, rent, etc. and did that as a joke, boom, it's a nightmare.

chrsstrm 2 days ago 1 reply      
So what happens if Facebook's system flags a message, it is reviewed by their staff and then dismissed as non-actionable, but turns out to be the precursor to a severe criminal act? Does the blame come back on Facebook for failing to prevent this crime?
danso 2 days ago 1 reply      
Never assume anything you send online is private. If the service isn't monitoring you, your friends are. And if not your friends, then the people who share your friends' computers, or anyone who comes into possession of it, have the potential to expose your communications.

And while this has always been the case ever since letter writing, electronic communication is so much easier to parse, distribute, and copy in bulk.

chrisballinger 2 days ago 1 reply      
All the more reason to use encryption technology like Off-the-Record (OTR) Messaging (http://www.cypherpunks.ca/otr)! I've been working on an OTR-compatible iOS app called ChatSecure (https://chatsecure.org) that is capable of encrypting your Facebook chats (or any other XMPP service).
fl3tch 2 days ago 2 replies      
This looks like it's mostly targeted at sex predators, but I wonder if the system is also activated if you jokingly tell a friend that they are "smoking crack".
lignuist 2 days ago 1 reply      
Someone should monitor Facebook for criminal activity.
icambron 2 days ago 2 replies      
It seems like the correct approach is for Facebook to do only what's legally required of it, and nothing more. That would allow society to have a transparent debate about what, exactly, should be required, leaving FB policy out of it.

As I understand it, FB is currently only required to respond to appropriately specific subpoenas and warrants. If the cops want more, they should petition for laws to require that and we can all argue about it like responsible citizens. And we could equally demand more protection.

But this thing where sometimes FB voluntarily sends law enforcement bits of information and sometimes they don't based on poorly defined criteria is just creepy. And why does FB even want this responsibility? Isn't the simplest, most obvious model to say no by default?

Zenst 2 days ago 2 replies      
What if Facebook makes a mistake - do they get done for wasting police time? Monitoring is all fine, but it needs to be done independently; anything else is a conflict of interest and something that Facebook staff can abuse.

You know, it would not surprise me one bit if Facebook had staff monitoring this, modding down every post that holds them in a true^H^H^H^HBAD light.

Zenst 2 days ago 1 reply      
FB has a terrible reputation with regards to privacy without real justification.

This is not surprising in any way.

If you don't like this, then don't use Facebook - really that easy, I have found.

melvinmt 2 days ago 0 replies      
Now I'm pretty sure that my account was (temporarily) disabled for this reason when I posted a politically biased link about the refused Iranians at Apple stores.
freemonoid 1 day ago 1 reply      
I was once informed by an FB employee that federal agents are ensconced at the FB premises to monitor users' communications and shut down / censor FB groups and venues for "hate" speech and terroristic threats.
naner 2 days ago 1 reply      
What if Google did this with its properties (search, gmail, gtalk, etc)?
sageikosa 1 day ago 0 replies      
Decades ago, a friend of mine (call him Mickey) was under a DEA investigation due to some bone-headed thing he did involving one of his acquaintances (call him Ken) asking him to receive a shipment from Colorado for him.

Now the fun part is another friend of ours (call him Jeb) was in the habit of making movie quotes when he started phone calls, so he calls up Mickey and leads in with a Lethal Weapon 2 line about "shipments", completely unaware that the DEA was potentially tapping the call.

Because of the way the warrant was written, Mickey was able to wave off the tap on Jeb's call since it only covered calls from Ken. But it could just as easily have led to all sorts of other problems, since between friends the level of discourse can go far afield of what a non-initiated 3rd party might consider normal.

five_star 2 days ago 0 replies      
This news made me lose even more interest in FB. They continually overstep users' privacy.
Zenst 2 days ago 0 replies      
Whilst FB has legal obligations in many countries, I must say when I read "phrases that signal something might be amiss, such as an exchange of personal information or vulgar language" the first thing that sprang to mind was nothing to do with crime. People swear, people exchange details. So I guess a lot gets flagged up to their staff.

Question is, do they warn you that your private conversation is not private, and do they comply with the data protection acts the various countries have? And more importantly, who monitors FB? So many things can be taken out of context and acted upon in good faith to the detriment of innocent parties; this is concerning. But I don't do FB, nor do I have any immediate plans to. That has nothing to do with this, but more to do with concerns in general about their privacy and the policies they act out.
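The kind of phrase scanning described here can be pictured as a simple pattern filter. This toy sketch invents its own trigger patterns (a phone-number-like string, a tell-tale phrase) and does none of the relationship analysis the real system reportedly uses:

```python
import re

# Invented trigger patterns, loosely mirroring the description of
# "an exchange of personal information or vulgar language".
PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),   # phone-number-like
    re.compile(r"\bmy address is\b", re.IGNORECASE),    # sharing a location
]

def flag_for_review(message):
    """Return True if a message matches any trigger pattern.

    The real system reportedly also weights how well the two accounts
    know each other; this sketch only does the pattern-matching half,
    and a human reviewer would still make the final call.
    """
    return any(p.search(message) for p in PATTERNS)

print(flag_for_review("call me at 555-123-4567"))  # -> True
```

Which illustrates the context problem above: a crude filter has no way to tell a joke between friends from something genuinely amiss.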

casemorton 1 day ago 0 replies      
So, "cooperation with police" now entails creating their own criteria for crime and dragnetting the presumed private conversations of a billion people.
I guess I better not confirm birthday parties or holidays with young relatives or friends' kids on that site.
benthumb 1 day ago 0 replies      
In the post-9/11 era this is de rigueur and at some level socially sanctioned in the name of keeping us safe from terrorists and social deviants. And I don't think you can argue credibly against these operations w/o first interrogating the various pretexts that have set the stage for them: Oklahoma City, 9/11, 7/7, 3/11 etc.
katbyte 2 days ago 0 replies      
Is there any way to easily and securely encrypt Facebook chats? A quick google finds:


Kiro 1 day ago 0 replies      
How come every time there's a post about Facebook the usually intelligent HN crowd goes nuts? I can understand the concerns but this whole thread is filled with alarmists and looks like something taken from reddit (or even 4chan).
MRonney 1 day ago 0 replies      
Tailor your free speech for who is watching, or else you could get into trouble. Welcome to the new America that our fear of terrorism built.
neo1001 2 days ago 1 reply      
I once made a joke on FB about a friend and posted "you smoke doobies", and he was so shit scared that he took it down. Lol
Dalton Caldwell: Announcing an audacious proposal daltoncaldwell.com
155 points by csmajorfive  1 day ago   49 comments top 19
skue 1 day ago 1 reply      
As a hacker who just terminated his FB account, it sounds like I'm the target audience. But I have to confess that after reading the blog post and clicking through to the faux Kickstarter page, my response went from "That could be interesting" to "That's disappointing."

If they are going to leverage the Kickstarter model, then they should at least learn from other KS projects: show me the problem, build a prototype, show the prototype solving the problem, play cool music, and for crying out loud look at me (the camera) when talking.

There's probably a good reason that Kickstarter doesn't accept nebulous web businesses - scaling manufacturing costs real money, but building a software prototype is cheap. If someone had made a video dryly explaining GitHub or DuckDuckGo as a concept prior to developing them, I wouldn't have signed up. But I find them invaluable today.

mgkimsal 1 day ago 0 replies      
What was very annoying about the sourceforge story years ago was that the only way they tried to monetize it was by selling "enterprise" versions. I wanted to use SF a lot (and eventually did for a couple of projects), but didn't want the ads. They shot themselves in the foot by not offering an affordable subscription service. I'm happy to pay bitbucket/github/whoever $x/month, but not $yyyy/month.

I may grow into needing the $yyyy/month plan, but most people don't start off at that end, and the majority of people who can afford something that expensive (because they're an operating concern with a lot of money) probably already have a solution in place (hence their ability to make money).

guynamedloren 1 day ago 4 replies      
> we have been spending the majority of our engineering resources the past 8 months building a “secret project”.... Additionally, we already have much of this built: a polished native iOS app, a robust technical infrastructure currently capable of handling ~200MM API calls per day with no code changes, and a developer-facing API provisioning, documentation and analytics system. This isn't vaporware.

So basically... it's already built.

> To manifest this grand vision, we are officially launching a Kickstarter-esque campaign. We will only accept money for this financially sustainable, ad-free service if we hit what I believe is critical mass. I am defining minimum critical mass as $500,000, which is roughly equivalent to ~10,000 backers.

Oh that's interesting. The grand vision will not manifest until $500k is raised. But isn't the product already basically finished?

So what happens if they don't raise $500k? Do they kill the product and flush 8 months of hard work (and presumably a bunch of money) down the drain? Doubt it. That doesn't make any sense.

Something is fishy here.

noelsequeira 11 hours ago 0 replies      
I'm not going to prognosticate about whether or not this service will take off. I really hope it does - the raison d'être seems genuine, or at the very least, passionately backed.

But that seems to be the very problem with it. In trying not to bury his lead, I believe Dalton Caldwell has let it take over the pitch. 95% "why" and 5% "what". 10 minutes in, I was still scratching my head wondering what the service will look like. And the name App.net certainly doesn't do the cause any favours. The first decent explanation finds itself relegated to question 2 of the FAQs.

This is an audacious attempt, and I laud that. But the pitch, in my opinion, needs an overhaul. Diaspora was also about the "why", but they addressed the "what" really early in their Kickstarter pitch.

jilebedev 1 day ago 1 reply      
What pain point exactly is this trying to solve?
That twitter is ad-spammed? I'll admit I rarely use twitter, but I can't recall ever seeing ads on it. (I can, as a side note, recall how /slow/ twitter pages load.)
That twitter changes its API frequently and doesn't give adequate notice to the people who develop for it?
How exactly does a user interact with app.net? I signed up for a twitter account because I wanted to hear a lot about a beta for a video game. How does this work with app.net? Are only broadcasters paying members?

Kudos for taking on the big dogs, though.

benatkin 1 day ago 1 reply      
Seems odd to not actually use KickStarter. Won't running a homegrown fundraising platform distract from solving the messaging problem? And don't you have to make a case for something that KickStarter already spent a lot of time making a case for? (I was turned off when I first heard about KickStarter, and I still am, to a lesser extent, but I finally pitched in for two projects.)
tgrass 1 day ago 2 replies      
> the takeaway here is that the services provided by SourceForge/GitHub are too important to their users to be ad-supported.

Why muddy the waters like that? The takeaway is that the incentives a company adopts are consequential and will define the culture and the product.

danso 1 day ago 3 replies      
I hate to be old school media here...but I wish this was announced on Monday or any other time besides Friday afternoon, when it will get less notice. Because it's a great announcement.
comex 1 day ago 1 reply      
So, this is a service where I'll pay $5 a month -ish to use a network that's basically the same as Twitter (in particular, not decentralized), except with no users and, considering the amount of effort being put into Twitter clients, probably inferior clients? It'll have no ads, but the ads on Twitter are pretty minimal from what I've seen, and presently as a third party client user I don't get any. It'll be open to "apps" which can extend the experience (and have a longer character limit), but to me Twitter's simplicity is its core selling point. It'll supposedly be used by the top 10,000 hackers, but identi.ca already tried that and it didn't work - network effect is important.

I love the spirit of this, but as described, I wouldn't use the service if it were free. Twitter isn't even in the same ballpark as SourceForge. shrug

mgkimsal 1 day ago 3 replies      
Seems to me that 'ad supported' is always going to be more attractive to funders/investors, because the potential is effectively unlimited (certainly by comparison with paid-for services).

With pay-for services, there's a much smaller number of people who are going to (or be able to) pay, and when you start looking at those numbers, it's never the fabled 'hockey stick to heaven' that people dream of.

What if Twitter just sold access to their stream, and a few million orgs (companies, individuals) were paying, oh, say, $20/month. And let's say... 2 million - why, that's only $480 million per year maximum! Gosh - who would ever want to invest in that? Instead, by going with 'ads', there's always the promise of some big change that could explode the revenue down the line.
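A quick sanity check of the ceiling sketched above (the 2 million paying orgs and $20/month are the comment's hypothetical figures, not real numbers):

```python
subscribers = 2_000_000   # hypothetical paying orgs from the comment
monthly_price = 20        # USD per month, also hypothetical
annual_revenue = subscribers * monthly_price * 12
print(f"${annual_revenue:,} per year")  # $480,000,000 per year
```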

sayemm 12 hours ago 0 replies      
Anyone else thinking about Diaspora as soon as he mentioned Kickstarter?

Grandiose vision, but going to war with Twitter today is as foolish as trying to take on Facebook.

You're not going to attract a mainstream audience and gain serious critical mass just because you're an open platform and developer-friendly. In the case of SourceForge/GitHub, it was different because the demographic is entirely developers. Not so with social networks or media companies of any kind, the audience is mainstream, that means college students, celebrities, high school girls, etc.

"Every battle is won before it is ever fought" (Sun Tzu) -- learn from Diaspora's failure, don't repeat it.

drumdance 1 day ago 2 replies      
Am I the only one who doesn't feel like Twitter is hammering me with ads? OTOH I don't use Twitter very much and when I do it's almost exclusively on my phone or iPad.
sylvaincarle 9 hours ago 0 replies      
Amazing that there is no mention of http://Status.net anywhere in this discussion.

Evan Prodromou has been trying to crack that nut for many years now...

fauigerzigerk 22 hours ago 0 replies      
I'm probably not the target audience for this because I don't understand what the proposal is. Or maybe I'd have to read some of his earlier posts to understand what this is about.

What is a real time feed and service or a social platform in general? Is this basically Facebook sans the UI for a monthly fee?

mkramlich 17 hours ago 0 replies      
Summary: "I propose receiving $500k USD of your money (with terms like Kickstarter, but not actually via Kickstarter) to build my new startup, in exchange for giving you zero equity. I know it's audacious."
state 1 day ago 1 reply      
I love this idea, but the only reason I can come up with that you need half a million dollars to do it is that it's a pivot, and not a bootstrapped enterprise.

If GitHub is being used as the example in this line of argument, then why not follow their lead and build something people need and raise money later? Wouldn't you then be even more likely to convince hackers, the primary community you're trying to serve?

khangtoh 1 day ago 0 replies      
Why screw over all the developers who have been using App.net? I assume their App Landing page developer service is going to be retired if this new product goes live.
maxogden 1 day ago 1 reply      
Are there github accounts for the 'team to do it' that contain the 'infrastructure'?
rudiger 1 day ago 0 replies      
Will there be a free tier?
Pay Too Much for everything allentucker.com
154 points by denzil_correa  2 days ago   160 comments top 48
beloch 2 days ago  replies      
The idea in this article is sometimes true, but it completely fails to take into account the relentless march of technology. Some items, such as cell phones or laptops, will be totally obsolete within 10 years no matter how finely made they are. Other items, such as guitars, will not be obsolete, since the technology isn't changing quickly, if at all. There will be new digital guitars with funky synth modes every year, but a Gibson dobro will still sound like a Gibson dobro in twenty years.

Some things fall somewhere in between. Your grandfather's watch is a marvel of mechanical engineering that will keep time adequately for centuries if taken to a watch shop for maintenance every few years. That maintenance will cost more than buying a new Timex would, and the Timex will keep far more accurate time. Timexes are disposable. When the Timex's battery dies, it will probably be cheaper to replace the watch than to replace the battery. However, their function is superior for all practical purposes. Many people currently enjoy the aesthetics of mechanical watches. They're currently enjoying a major surge of collectibility, but that wasn't the case twenty years ago, and it may well not be the case twenty years from now.

The real trick is to figure out what can last and what should be treated as disposable. e.g. Say you're building a home theater. Amps have changed very little over the last 20 years. Preamps and receivers go obsolete every few years. If you spend $5000 on a Bryston amp it will easily last 20 years (that's how long it'll be under warranty!), but a Bryston preamp will be hopelessly unusable long before that 20 year warranty runs out.

There are also big variations in short-term durability that aren't necessarily correlated with price. e.g. MacBook Airs and the new Retina Pros are gorgeous pieces of engineering, but they're made to be disposable. If you spill coffee on them they're basically done. You can't remove the battery and take them apart to clean them (proprietary screws) like you can with a much cheaper laptop, and Apple won't lift a finger to help you. This is a case where our perceptions of what is durable and high-quality can actually lead us to buy something that won't last as long!

latch 2 days ago 7 replies      
I don't agree with most of this. The idea that price equals quality has long been used by merchants to extract money from consumers, not extend value. Sure it's true of some products and some companies, but overall I'd argue that, at best, more expensive things aren't proportionally better by any meaningful measure.

A $300 meal isn't 10x better - in terms of taste (subjective obviously) or nutrition - as a $30 meal. A BMW 3 Series is a better car than a Ford Focus, but looking only at the practical reasons to own a car, it's not even close to having twice the value.

In a lot of cases, brand names bring no value other than the brand. And while that can have a placebo effect, it's not more value. Medicine comes to mind as a great example. You are paying more for marketing and packaging than you are for R&D. The recent thread on memristors even highlighted that R&D costs are 1/100th of total cost, with marketing taking the lion's share.

Quality items appreciate while cheap items depreciate? No consumer good should be viewed as a financial investment. If something does appreciate in 50 years, it'll be more about luck than anything else.

wccrawford 2 days ago 3 replies      
A while back, I stopped buying the highest-priced goods and started doing research instead. What I found was that the best things weren't the highest-priced. I also found that I could find the best items for me, because what I care about isn't necessarily what others care about.

Using this, I've managed to buy better shoes, toasters, and even a car than I've ever had before, and at less than I would have spent if I hadn't done that research. The car ended up being the cheapest car made by that manufacturer, and I like it better than any other car I've ridden in or driven. (Admittedly, I haven't tried anything over about $50k, but I couldn't afford those anyhow.) I would never have found it if I used price as an indicator of quality, instead of reviews.

The key to reviews is not to pore over the good ones. Look at the bad reviews instead. Find out what people hate about it. And then ask yourself: does that feature matter to me?

My rice maker doesn't make brown rice well at all. Many people complained about that. However, white rice is the only kind I make, and it does a great job on that. So it didn't make sense to spend twice as much money on a better rice maker. I could have spent more money and gotten an objectively better rice maker. But why bother?

So no, I don't agree with paying too much or buying the 'best'. At all.

PedroCandeias 2 days ago 1 reply      
I don't think the Op is getting as much love as he deserves.

  Many of the intangible pieces that make up the quality of a product or service
go out the door when we're getting a deal. They're doing a favor and sometimes
when we bargain, people resent us.

This is absolute gold when it comes to longer business relationships. I have come to believe that entirely too many fail simply because someone is resentful over terms.

As for the relation between cost and quality, I agree with many here that it isn't clear-cut. But the relation does exist. You just need to be discerning, which is why the Op writes:

  Because we only buy quality, we are forced to wait until we can afford what we
really want. That wait time leads to better decisions, and it forces us to
make due with what we have.

# Edit: formatting

drostie 2 days ago 2 replies      
One key way in which the article is right on-the-money, so to speak, is to highlight the ridiculous frenzy that we go into to save disproportionately small amounts of money.

Our audience here is mostly programmers. Programmers should all know the golden rule of optimization: profile it first. Don't take it on intuition or faith that "this is what was slowing me down" unless you've measured it. The same principle applies in personal finance. Group together your expenses into some key logical categories -- "groceries", "tech", "rent", "retirement", "health", "nights out", and whatever else makes sense to you. Then add them up, do the maths. Know what you're spending and how it compares, and how the categories make up your monthly budget.

Frugality is generally quite good, but just like code, it's quite possible to overoptimize it into a time-sink.

Also, it's key that you see both the high-cost single-time expenses and the low-cost-but-everyday expenses and you get an idea for how they compare. Depending on how often you visit Starbucks, it might be a very good or very insignificant change to brew your own coffee at home; go on and measure it.

The single most surprising thing that I discovered about this process was that, even though I feel like I give money to beggars in the street "often" (i.e. whenever I see them and they ask) and give "a lot" (i.e. much more than is normally typical), the amount that I give actually works out to almost nothing on a monthly basis. Like, I'm poor and right now I have no proper employment (they don't pay you to do a Master's degree) and yet I can still afford to be generous to the people who are down on their luck, and it's just nothing when you compare it to the amount I'd save if I stopped drinking cola. I've got no income, but I'd still easily spend € 50 on food to throw a party for my friends, so long as it happens only about once a month, twice max.

scott_w 2 days ago 0 replies      
This article is based on a number of nonsensical assumptions:

1) That paying more equates to higher quality. It doesn't. Your £30 t-shirt will fall apart every bit as quickly as your £3 multi-pack from Primark.

2) That you want something to last a lifetime. Just last weekend, my dad was complaining about having no good jigsaws, then followed up with "of course, I could buy a good one, but I only use them once every 5 years, so they're knackered when I need them anyway".

3) When you want quality, you can't just borrow it off someone else.

I'm not against paying more for something, e.g. my uncle has expensive joinery kit because joinery is his day job. I simply dislike the mentality of "I must pay top dollar for everything" before evaluating what you actually want.

suprgeek 2 days ago 2 replies      
I think what the article is trying to say is "Buy the best quality that you can afford". Unfortunately the Author has completely conflated Quality with Price.

Higher price != Better Quality (not always anyway)

praptak 2 days ago 1 reply      
As always with the "Do <not-very-popular thing> and get <obvious-benefits>" articles, there is an obvious question: why doesn't everybody do that? I cannot speak for other people but here are my reasons for not always going for the quality over price.

First, I obviously cannot always afford the high price. When I bought the car I had checked the failure reports and decided it just wasn't worth paying an additional few months salary to get the failure probability down 2-3 percent a year (checked other parameters too, similar conclusions.) And yeah, I waited till a good deal was available to me.

Second, getting a quality item might cost even more of my personal time - finding out which one is actually good and searching for a local vendor isn't exactly free.

Third, maybe I don't even want an expensive item that lasts forever. Suppose I want to get into photography but don't really know if I'll keep this hobby. I'd intentionally google "cheap beginner dslr" and stick with the findings. Chances are that by the time it needs replacement I'll be ready for a more fancy gear or bored with the new hobby. Win-win :)

colomon 2 days ago 0 replies      
I'm not sure I'd say this article is wrong, exactly, but it sure seems like all the arguments in it are very shallow. Yeah, I have my great-grandfather's pocketwatch, and it still runs. But it has required repairs that each cost significantly more than buying ten brand new cheap watches would have. It's the least reliable timepiece I own. And I hardly ever use it, because my cellphone is always with me and is considerably more accurate -- and the watch is not the only heirloom I have from my namesake great-grandfather! Its main value to me is pure sentimentality.

I'm sure other watches were purchased by my other great-grandfathers; to the best of my knowledge, none of those watches survived to the present day. It might be because they only bought cheap watches. But it might also be that they bought expensive watches and those watches broke or got lost. Looking solely at what survived to today tells you very little about what was purchased yesterday.

And I'm sure my great-grandfather didn't buy the most expensive watch possible. If it had come down to the choice of buying a watch which cost $10 more, or saving that money to give me $200 today, I'd take the money!

morsch 2 days ago 0 replies      
I wish the article was true. It'd be so nice to simply pay a premium and rest assured that it's an equitable approach: you get an improved product, the retailer gets a higher profit, the distributor, the people actually creating the product get a cut.

The reality is that you can't rely on any one of these being true, and as soon as you're willing to buy something, especially if you're willing to pay 10, 50 or 100% more than the absolute minimum, you're prime meat for the other half of the world who make it their business to ensure that it's not an equitable approach.

wanderr 2 days ago 1 reply      
On the other hand, I've noticed, and others have observed as well, that generally more expensive items are more likely to have poor reviews on Amazon, for example. The most likely explanation is that higher prices lead to higher expectations, which are often hard to meet.

Another point I'd like to make is that cheap stuff can be totally awesome, especially if you don't need or want it to last a lifetime. Example: IKEA. My girlfriend changed her mind about the furniture she wanted in her apartment several times over the few years she was there, and was able to update the look to be exactly what she wanted for very little money. Compare that to the armoire I paid an ass load of money for that will surely last a lifetime, but doesn't really fit anywhere in my current home.

mattm 2 days ago 0 replies      
The highest-priced items are that way mostly due to the marketing costs associated with selling high-priced items and trying to create the illusion that they are better quality.

Lifetime guarantees are pretty much worthless because most people will forget they have one, lose the item or lose the documents related to it.

Top pay doesn't attract the best people. Top work attracts the best people because the best people have most likely realised that money doesn't really matter if you're working on something that doesn't provide meaning for you.

The sweet spot for quality is to figure out the approximate average price of the product you want to buy and then pay a little above that.

mathgladiator 2 days ago 3 replies      
I've found this to be very true in many ways when buying 'quality'. Buy 'quality' and not 'crap', and you will always be happier and the OP's statements are all true.

I think price is a poor estimator for quality since marketers know that price alone can signal quality, so it stands we need something beyond price.

How do we determine if something is quality when price is unreliable?

terhechte 2 days ago 2 replies      
The Russians have a saying for this: "I'm too poor to buy cheap." It really says everything about it.
Wilya 2 days ago 2 replies      
I definitely disagree. Maybe sometimes, paying twice as much gives you something that lasts ten times longer. Or maybe, you will pay five times as much, and get something that (hopefully) lasts a bit longer.

I use a lot of earphones, and they tend to break quite fast. Once or twice, I decided to move away from the usual $15 models and pay $60 for better brands, with good reviews and everything. Did they last longer? Of course not. They sounded marginally better, but nothing worth the price.

The thing is, price is an indicator of positioning. It's a marketing tool, and it's generally not correlated to quality. Buying the very cheapest is usually not a good idea, because the manufacturer probably did everything it could to actually beat competitors and propose the lowest price. But other than that, that's pretty much all you can know in advance.

Knowing what items are made of good quality and will last longer (and therefore are worth paying an extra for) is very hard. So, my reasoning on that is that I prefer to spend less money on something, since on average, it will live just as long.

crazygringo 2 days ago 0 replies      
A lot of people tend to buy middle-priced things. Marketers rely a lot on this fact, actually, to take advantage of your ignorance.

I've found it's much better to avoid middle-priced things, most of the time. Usually, the cheapest option will meet your needs just as well.

But then you need to identify the things in your life that really contribute to your happiness, and then pay what's necessary for full quality in those.

I spend top dollar on fresh meat and vegetables and on my kitchen pans and chef's knives and olive oils for salads. But I have the cheapest vegetable peeler and spatula and paring knives and toaster and olive oil for frying.

nazgulnarsil 2 days ago 0 replies      
People who are too lazy to amortize will spend a lot more money in the long run. The perfect example is the hilarious subsidized cell phone market in the US.
mmcnickle 2 days ago 0 replies      
For me the most difficult thing about buying "quality" is the effort required in researching what the quality product is. It's been mentioned that price isn't always a good indicator, and the mainstream stores generally don't cater for people looking for quality.

For example, I'm buying a set of screwdrivers for a DIY job. I know I'll need screwdrivers for the rest of my life, I want to buy quality ones. I go to some big DIY store expecting a good range of screwdrivers. What I get is a selection of mickey-mouse screwdrivers (£5 for a set of 20) and some middle-of-the-road-but-overpriced screwdrivers (£20 for a set of 20). There is no "buy these and never buy another screwdriver again" set for £60-100.

You can get these, but it takes considerable effort to track them down.

wickedchicken 2 days ago 0 replies      
Feeling insecure about your place in life? Worried you're not happy enough, cool enough, or loved enough? You clearly aren't spending enough money. Don't bother creating things or building interpersonal relationships. Spend more time thinking about acquiring physical goods and making sure you're only getting the best.
fsniper 2 days ago 0 replies      
But there is a "new - well, not very new exactly -" economic trend to build products with an end of life carved into them. So buying expensive stuff doesn't mean having long-lasting products anymore. That's a story from another era.
debacle 2 days ago 1 reply      
The problem is that the modern consumer cannot associate cost with quality. I usually buy the cheapest appliance I can find (microwave, coffee machine, lawn mower) because it has the fewest bells and whistles and is likely the only one I can fix myself.
webjunkie 2 days ago 1 reply      
One has to differentiate. There are things that are worth having once and for life. But most things I gladly buy cheap - because I know I can dispose of them later and not a fortune is lost. What if I spend much money on fashionable things that I like, but dislike in 2 years? I'd rather swap out most of my cheap stuff over the years than worry about what will last longest and whether I might still like it and so on.
arjunnarayan 2 days ago 0 replies      
What about the selection bias? You don't see the watches your grandfather bought that didn't make it down to you because they broke. Of course the few carefully selected things that have survived and been handed down are the exceptional pieces!
Havoc 2 days ago 0 replies      
I associate old with quality more than price.

e.g. My dad has 2 Bosch drills ( http://i.imgur.com/aCkaq.jpg ). Roughly 20 years old. When you hold them in your hand you just know that it'll last another 40 years. As a result I have a really difficult time taking any of the modern drills seriously.

UnoriginalGuy 2 days ago 2 replies      
The problem with this "buy for life" stuff is that how do you determine what is really a "buy for life" product rather than just an over-priced "brand?"

For example, I could go out and buy a $200 pair of sunglasses but depending on where I buy them they can either be poor quality (e.g. fashion brands, sunglass hut, etc) or exceptional (e.g. fishing shops, sailing suppliers, workman-glasses, etc).

You would expect reviews to give you an accurate way to tell, but people often review a product by the way it makes them FEEL rather than about the product itself.

prawn 2 days ago 0 replies      
Reminds me of some gift buying advice I read. Given a specific budget, don't get a big-but-cheap item, but a higher-quality but smaller/simpler item. The latter is more likely to be valued and retained.

I think the example given, for a golfer, was a crafted/precious golf tee over a trashy electronic putting game. (Ignoring, of course, the fact that the tee might easily get lost.)

typicalrunt 2 days ago 1 reply      
I liked the article in general, but this got me:

> I usually eat at the same few restaurants all the time. They're maybe 10% more expensive, usually locally owned, and the food doesn't come out of a frozen pre-made bag before being tossed in the oven. I never tip less than 20%, and I'm not an asshole… at restaurants.

I don't understand this thought process. Why not pay 100% tip then? Is the 20% number somehow magical and affords the ability to be elitist? So if I tip 15%, which is the average suggested tip in Western Canada, I'm somehow an asshole???

ajaimk 2 days ago 0 replies      
The key part of that is the line about services and tips. Just being a generous tipper gets you better service, especially if you are a regular at the same restaurant or bar. 20% by default, with an occasional bump up to 25%, gets you remembered and the servers get to know you.

I've had such a good relationship at some of my regular restaurants that I once had UPS deliver a package to the bar for me when I was out of town. Other places, the staff starts comping your drinks. My best one was the manager at a restaurant giving me a blanket 20% discount on anything I buy (I tip at least 25% after they put in that discount).

In the long run, you get better service if you pay for it.

Loic 2 days ago 0 replies      
The author is right, but the problem at the moment is that, more and more, you cannot correlate quality with price. That is, expensive gear can be of low quality. It is harder and harder to pay more knowing that you get more.

Also, the missing point is that you need to take care of your expensive gear. You need to put money into maintenance.

jackalope 2 days ago 0 replies      
Design determines manufacturing costs. Engineering determines durability. Marketing determines price. Coordinating all three is essential to delivering value. It's absurd to suggest that any one (or all) of them offer protection from obsolescence.
tchock23 2 days ago 0 replies      
I agree with most of the ideas in this article with the small exception of the example he gives for services and hiring "the best firm in the country." I think you're mostly paying for overhead and bloat in those cases, and I would prefer to work with "the guy down the street" if I can (in which case I'd want to find the best "guy down the street," regardless of cost).

Other than that, it amazes me how people won't spend for quality in situations where it clearly is in their best interest for the long-term...

parka 2 days ago 0 replies      
The article reminds me of what I learned in a marketing class:

Value = Benefits / Cost

Marketing comes into play here. If it didn't, items for sale would use cost-based pricing.

A Mac and a PC can both accomplish the same task, but one is more expensive than the other. Being cheaper doesn't necessarily mean it won't last as long.

Tipping is one area that I don't understand about cultures that have it. Generally, I would expect the tip to be included in the price of the food. Why waste my time and have me figure out what amount to tip? Personally, I don't see the perceived value of tipping.

In some way, how much you pay is also determined by how you see yourself, or in some cases want others to see you.

jakeonthemove 2 days ago 2 replies      
Instead of overpaying, I think it's better to get the best product you can for the money you have. It does require you to learn how to recognize quality and know the prices, but it pays off in the end.

For example, I wouldn't buy a Ferrari or even Porsche 911 (both high quality cars, no doubt, but expensive) when a Nissan GTR R35 performs just as well and costs less.

All three will last a lifetime with a bit of care (I hate how people just use their tech without any maintenance, then wonder why "the POS" broke down!), so why pay more (aside from the design)?

berntb 2 days ago 0 replies      
I got a really good workgroup printer, paid more than twice as much as for a cheap one.

I loved it but had to leave it when I moved -- just too heavy.

(And anyway, the Mac support wasn't good a couple of O/S versions later, but it still worked well with Linux.)

My takeaway from this has more to do with negotiating moving packages (and Samsung support of old customers, grumble) than anything else.

prezjordan 2 days ago 0 replies      
This is how I feel about my computers. I don't want a "good deal" when it comes to price. I'll spend a lot, and get a lot in return.
maciejgryka 2 days ago 0 replies      
I highly recommend The Paradox of Choice http://www.amazon.com/Paradox-Choice-Why-more-less/dp/006000...

To quote from the article:
"People who constantly try to always get that great deal end up spending all their time chasing those deals and never actually get things done. I've seen people do this their entire lives, and it is debilitating."

You can say the same about people who spend all their time trying to constantly get "the best" product. Figure out a few things that are important to you and maximise these - whether on quality or price. For the rest, learn to accept "good enough" and get on with your life doing things that matter for you.

denzil_correa 2 days ago 1 reply      
> This is so weird to me. No one haggles over $5 on the price of a car, but it seems that everyone needs a tip calculator to determine if they should pay $21.50 or $22.00 for a meal.

$5 on a $2,000 car is 0.25%, while $0.50 on a $20 meal is 2.5%.
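The same comparison as a minimal sketch (`haggle_share` is a made-up helper name, just expressing the saving as a percentage of the purchase price):

```python
def haggle_share(saving, price):
    """Saving expressed as a percentage of the purchase price."""
    return saving / price * 100

print(haggle_share(5, 2000))   # 0.25  -> $5 off a $2,000 car
print(haggle_share(0.50, 20))  # 2.5   -> $0.50 off a $20 meal
```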

b0rsuk 2 days ago 1 reply      
We have a saying:

"Poor people can't afford inexpensive stuff."

saturn 2 days ago 2 replies      
> My favorite pair of jeans gets worn 10 times more often than my other jeans. If I did away with the other jeans, I could afford to buy more of those things I really love.

This took me embarrassingly long to realise. Instead of buying clothes because they were pretty nice and on sale, buy only what you absolutely love, and pay full price. Instead of having "favourite underwear", get rid of everything that's not your favourite and make sure you only own favourites.

Yeah, it costs more at first. But over time you build up a wardrobe of high quality clothes you love. Quality over quantity, indeed.

mathattack 2 days ago 0 replies      
Great modern defense of penny-wise and pound-foolish.

The challenge is that automation and commodification push us towards the lowest common denominator. Look at airlines, forced to compete on price, not quality.

allentucker 2 days ago 0 replies      
Author here - There are many good points here. The purpose of my article was to give a different perspective from the default "buy whatever is cheapest" method that I believe most people follow. Obviously it would be an error to pay more for things that aren't better, but I believe this error is often better than the opposite error, buying twice. A good example: I'm writing this on a 10-year-old laptop running at 1900x1200. Thanks for the comments and feedback.
alain 2 days ago 0 replies      
On the topic of paying less, there's a saying which goes something like this:
"the cheapest price you can get for a thing is to not buy it" [1].
Of course this can't be applied to everything, but a lot of people, myself included, tend to buy a lot of gadgets they don't need and don't use.
This is simple advice, but I think it can go a long way towards saving money.

[1] if someone has the exact quote, I'll be thankful for it.

logical42 2 days ago 0 replies      
Would have preferred re-reading the Wikipedia article, personally (en.wikipedia.org/wiki/Time_preference) as it was better written, and considerably less self-satisfied in tone.
scott_meade 1 day ago 0 replies      
A lot of commentators missed the point: "Sometimes a cheaper product is actually better. But consider removing price as the default decision criteria."

It's simple and good advice.

cdvonstinkpot 2 days ago 1 reply      
Looks like an interesting article, but I can't bear to read it. Too much side-scrolling on my Blackberry. Why doesn't the text autofit my screen like other pages do?

I'll have to read it when I get home.

jwl 2 days ago 0 replies      
Partly true, but I guess most of us don't buy new things only when the old ones are totally worn out. It's more like we just want some new stuff for a change, not necessarily because what we have is broken. So it doesn't automatically save you money.
nsxwolf 2 days ago 1 reply      
Generally: Cheap jeans last forever. Expensive jeans tear in about a week.
martinw2k 2 days ago 0 replies      
Well, that was a very long way of saying "you get what you pay for".
Apple is back on EPEAT apple.com
152 points by kellysutton  1 day ago   168 comments top 17
droithomme 1 day ago  replies      
"All eligible products" is a key, the new iPad with its glued in battery is not among them.

The "more efficient and longer lasting" case materials he brags about are irrelevant in their disposable products made of low quality components that are not user serviceable or upgradable.

It's like bragging that you made your automobile frame out of solid titanium with a carbon fiber shell, while ignoring the fact that you built the engine without any way to change the oil, so you're going to be throwing it out or sending it in for major costly service after a short time. Such planned-obsolescence designs are certainly not environmentally sound, and claims about the longevity and strength of the frame materials, and even certifications, are just PR to distract and hypnotize the marketplace into believing the opposite of the reality of the situation.

mkaltenecker 1 day ago 4 replies      
I'm thinking about whether this is an atypical reaction by Apple, and I think it's not, but I'm not sure.

The pattern is typical, more or less. Some criticism appears. Apple is dead silent for a few days. Then Apple has a comprehensive response to the criticism. (An alternative that also happens frequently: Apple doesn't ever mention the criticism.) That's what used to happen in the past, and that's what nearly happened here.

The difference is that they responded with a different message pretty quickly after the criticism (arguing that EPEAT isn't such a great certification), so it's not true that they stayed completely silent.

When it comes to the message itself, I don't think it's that atypical. Apple rarely responds to criticism, so there are few situations we can use for comparison. It's not as big a deal as Antennagate, so they picked a less involved way to respond (basically a press release instead of a press conference), but in every other respect it's pretty similar.

This time there is a clearer mea culpa, but the undertone is still that EPEAT is a bad certification. (During the Antennagate press conference the undertone was that it's not really that big of a deal, and it was a much more obvious undertone.) The tradition of Apple execs writing letters also continues.

I would only say that Steve's letters tended to be more about presenting arguments. That certainly has something to do with the different purposes (explaining why DRM/Flash are bad vs. admitting you were wrong and reversing direction), but I still would have preferred it if Bob Mansfield had explained more of Apple's reasoning.

younata 1 day ago 2 replies      
The retina macbook is rated EPEAT gold.

unreal37 1 day ago 11 replies      
"Apple makes the most environmentally responsible products in our industry."

How is this possibly true? Nothing Apple makes can be easily opened or upgraded, or have its battery replaced. Their products are made to be obsolete in 2-3 years.

When my iPad screen cracked, they made me buy a new one (at a discount) instead of fixing it. I doubt they even fixed the old one I gave them - they probably just discarded it. How is that environmentally friendly?

I own almost everything Apple makes. But they need to better explain how built-in obsolescence and impossible-to-fix devices equates to environmentally friendly.

swilliams 1 day ago 1 reply      
One interesting thing to note is that this is from Bob Mansfield, and not Tim Cook. When Jobs was CEO, I don't remember a subordinate ever releasing something like this.
derwildemomo 1 day ago 2 replies      
"I recognize that this was a mistake." strikes me as a very un-appleish communication style.
schiffern 1 day ago 0 replies      
Things that everyone seems to be forgetting:

* Apple makes thin, light, durable products. Reduce > Recycle.

* Raw materials are a small amount of the embodied energy in electronics. The microchips themselves constitute many times the embodied energy. Again, reduce > recycle.

* As others have pointed out, Apple didn't do this because any of their newly-released products weren't eligible.

Putting it all together, Apple did this to send a message to EPEAT: "Disassembly isn't the end-all be-all of green." Looks like EPEAT caved.

dannyr 1 day ago 4 replies      
When Apple decided to leave EPEAT, Apple Apologists/Fanboys said it's because EPEAT is outdated.

I wonder how they are going to spin this one.

digitalengineer 1 day ago 0 replies      
"Our relationship with EPEAT has become stronger as a result of this experience, and we look forward to working with EPEAT as their rating system and the underlying IEEE 1680.1 standard evolve"

Reads to me like EPEAT is moving (a bit) in Apple's direction with the new future IEEE-standards.

hollerith 1 day ago 0 replies      
One datapoint:

Almost all power cords and extension cords in the US contain lead: the lead is added to the plastic part of the cord when the plastic is still "molten" and its purpose is to make the plastic less flammable.

Although I did not do a chemical assay on it or anything, I am pretty sure that the power cord Apple included with my 2011 Mac mini contains no lead. (The cord has a different, more rubbery feel to it that strongly suggests a completely different material, and I might have seen a claim to that effect somewhere on the web.)

mtgx 1 day ago 1 reply      
But does this mean their products will now be more recyclable and whatnot? Or does it just mean they put their EPEAT badge on the site again, even though their new devices are still not as compliant as they used to be?
recursive 1 day ago 1 reply      
What a coincidence! It just so happens that EPEAT is now relevant again!
rabidonrails 1 day ago 0 replies      
I'm a bit thrown by the overall tone of this announcement. I'm totally on board with the idea that Apple is doing its best to make "green" products and, apparently, they're leading the way - great!

But if it's all true, why did they pull the products from EPEAT at all?

abcd_f 1 day ago 0 replies      
> Signed Bob

A middle finger, the Apple way.

guscost 1 day ago 0 replies      
Well, I have less respect for Apple now, and still none for Greenpeace.
tscrib 1 day ago 0 replies      
Glad to see that Apple is listening to customer feedback and acting on it.
hell0_th3r3 1 day ago 0 replies      
geez apple, get it together. i was mildly impressed by tim cook when he took over, but now i'm getting the impression that he is apple's steve ballmer. this was a ridiculous and trivially preventable gaffe.
       cached 15 July 2012 04:11:01 GMT