hacker news with inline top comments    .. more ..    8 Aug 2017 News
Internet Draft: Let 'localhost' be localhost ietf.org
365 points by beliu  9 hours ago   115 comments top 19
user5994461 8 hours ago 6 replies      

 First, the lack of confidence that "localhost" actually resolves to the loopback interface encourages application developers to hard-code IP addresses like "127.0.0.1" in order to obtain certainty regarding routing. This causes problems in the transition from IPv4 to IPv6 (see problem 8 in [draft-ietf-sunset4-gapanalysis]).
That does remind me of the times I was dealing with weird connection issues in some critical services.

It turned out to be related to the use of "localhost" in the configuration. It resolves to IPv6 on some systems, and that breaks everything because the target app is only listening on the IPv4 address.

Went as far as removing all references to localhost and adding lint errors in the configuration system so that no one could ever give localhost as a setting in anything.
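The dual-resolution behavior the parent describes is easy to observe directly. A minimal sketch (the port number 80 here is arbitrary):

```python
import socket

# "localhost" may resolve to IPv4, IPv6, or both, depending on the
# resolver and /etc/hosts. An app listening only on the IPv4 loopback
# will then fail for clients whose resolver returns ::1 first.
infos = socket.getaddrinfo("localhost", 80, proto=socket.IPPROTO_TCP)
families = {info[0] for info in infos}
print(families)  # e.g. {AF_INET} on some systems, {AF_INET, AF_INET6} on others
```

A configuration lint like the one described could reject the literal string "localhost" and require an explicit loopback address instead.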

nhance 9 hours ago 9 replies      
If this doesn't happen or takes too long, there's always lacolhost.com and *.lacolhost.com. I own this domain, have registered it out until 2026 and vow that the domain and all subdomains will always redirect to localhost.

It's easy to type and easy to remember and should always do a good job of expressing intent of usage.

jonathonf 8 hours ago 3 replies      
I've had web browsers perform a web search for 'localhost', or even just redirect me to localhost.com.


tcbawo 1 hour ago 0 replies      
At work someone once spent hours trying to resolve a network issue. Turns out he didn't have a localhost entry in his /etc/hosts and some sadistic person had created a VM named 'localhost' that registered a DNS entry via DHCP.
feelin_googley 5 hours ago 1 reply      
At least on the OS I use, which is more IPv6 ready than most, /etc/hosts solves this "uncertainty" problem.

I have found that failing to include a localhost entry in the HOSTS file can lead to some strange behavior.

If there are "computers" out there that have no /etc/hosts or deny the computer's owner access to it, then it might be time for an Internet Draft from Joe User.

There should always be a mechanism for the user to override the internet DNS. And applications should continue to respect it.
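A minimal sketch of the override mechanism being described: the canonical /etc/hosts entries that pin localhost to the loopback address for each IP family, which the local resolver consults before internet DNS.

```
127.0.0.1   localhost
::1         localhost
```

Removing or shadowing these entries is what produces the strange behavior mentioned above.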

bryanrasmussen 8 hours ago 0 replies      
this reminds me of a class I went to at a major company in 1999. We had problems following the setup instructions, which included going to localhost/db-admin-path; after some sleuthing it turned out somebody 'in corporate' on the network we were using had named their computer localhost.
ccheever 3 hours ago 0 replies      
One time I was debugging a problem for a user of our desktop software (I work on https://expo.io) by sharing his screen and taking over his computer. And it turned out the reason the user was having problems was that in his /etc/hosts file, he had an entry pointing localhost at the IP address of some other computer on his network. Crazy. I have no idea how anything worked on his machine.

Took a while to track that down. It was both bewildering and sort of satisfying to figure it out in the end.

sgtpepper43 1 hour ago 0 replies      
Just add a line to your hosts file mapping lolcathost to 127.0.0.1 and you never have to worry about it again.

No that's not a typo

chr1 7 hours ago 2 replies      
Does this mean that an entry in /etc/hosts assigning an IP to localhost will be ignored?
eduren 8 hours ago 3 replies      
Can anybody with more knowledge point out techniques that this would break?

Are there any software or networking patterns that currently rely on localhost _not_ resolving to the loopback?

EDIT: The RFC mentions that MySQL currently differentiates between the two, but that's it.

filleokus 8 hours ago 3 replies      

 The domain "localhost.", and any names falling within ".localhost.", are known as "localhost names". Localhost names are special in the following ways []
Is this not implemented on macOS or am I just misunderstanding?

 ~ ping test.localhost
 ping: cannot resolve test.localhost: Unknown host
 ~ ping localhost.test
 ping: cannot resolve localhost.test: Unknown host

ericfrederich 5 hours ago 1 reply      
Sounds reasonable, but would probably break a ton of stuff. Does this provide enough benefits to outweigh the downsides?
age_bronze 8 hours ago 1 reply      
There was no RFC for localhost yet?! That's pretty surprising... Does this RFC have any practical meaning? People didn't actually register a localhost. domain, did they? Is there an actual line of code that this should change? Or are they just trying to promote writing localhost instead of 127.0.0.1?
nailer 6 hours ago 0 replies      
Surprised the more common .localdomain was omitted in favor of a .localhost domain.
inopinatus 4 hours ago 0 replies      
I would very much like to see this draft extended to cover SRV lookup as well.

Right now, section 3 of this draft would prohibit all SRV queries for localhost, which may hinder development and deployment of a SRV based application. That's an immediate problem.

But not only are there existing applications to which it is immediately applicable - it is a design error in HTTP that plain address records are used for resolution. One day this will be corrected, in which case measures like this should continue to apply.

agwa 8 hours ago 2 replies      
To be clear, this is not an RFC yet. It's not even adopted by a working group, although I hope it will be.

Mods: can RFC be removed from the title? [Edit: thanks for updating the title!]

lolcalhost 8 hours ago 1 reply      
This sucks. I have registered and am actively using a 'localhost' domain name under one of the new generic TLDs for emails and account signups for quite some time now.
alexellisuk 7 hours ago 1 reply      
Localhost resolving to IPv6 basically breaks with Docker, which, unless you give it special instructions, only listens on IPv4. With curl, for instance, you can use the -4 parameter, but it's probably best we start saying "test the site on 127.0.0.1" in tutorials.
pmarreck 8 hours ago 2 replies      
Why couldn't they just redirect "localhost" at the DNS level to 127.0.0.1?
Some insurers insist that patients forgo generics and buy brand-name drugs nytimes.com
34 points by iamjeff  3 hours ago   25 comments top 11
AdamN 28 minutes ago 0 replies      
The big secret is that bio-equivalent and generic drugs sometimes aren't effective!



Frustrating that the article doesn't point that out. These new formulas do not need to prove they work; they just have to prove that their active ingredient is the same after the patent expires.

cbanek 2 hours ago 1 reply      
I have United Healthcare, and I have to say I've seen this happen to me as well. I'm on a long term maintenance medication, which is delivered by patch. I was on the generic in my previous health plan, so my new doctor prescribed the same generic.

I used the mail order pharmacy and they told me that a 90 day supply would cost $347. I asked why it was so much, and where my prescription benefits came in. They said they didn't cover the generic, and I said well do you cover the name brand one? They said they did, but they couldn't give it to me because my doctor had ordered the generic. If I ordered the name brand, it would be $100 for a 90 day supply, which is a huge difference.

I called my doctor, and got them to change the prescription. None of the people at United Healthcare were offering any of this information, and I basically had to pry it out of them to figure out why they were trying to gouge me. Also, this was their own private mail order pharmacy, so all the money was going to them as well.

Ask a lot of questions before paying a lot of money.

broknbottle 36 minutes ago 0 replies      
I wouldn't be surprised if this was due to issues with some generics not being equivalent. Concerta is a weird drug, with one generic actually being the rebranded brand name. The other generics are not considered equivalent by the FDA, but the last time I checked some pharmacies such as Walgreens carry the non-equivalents. Apparently they are tied up in a court case.


zamalek 1 hour ago 1 reply      
My sister had an acne breakout in her teen years and went on a brand-name hormone supplement - it worked wonders. She went on the generic and only had problems - so far as I understand the situation, the delivery mechanism is different across brands, though it's very rarely a problem for most people. As always, it's your health at stake - if you feel as though something is not working for you, then switch.
tzs 1 hour ago 3 replies      
I wonder if the guy in the article has considered buying his drug without insurance? In the case of the generic for Adderall XR it is about $70 with a GoodRx coupon at Walgreens, which is less than the $90 copay to get the brand drug with insurance.

Even if your insurance company lets you use generics, it is a good idea to take a look at GoodRx, and take a look at Walmart.

I've had generics where through my insurance my out of pocket was a $20 copay, but when I checked GoodRx there was a $12 coupon. I've had other generics, again with a $20 copay if I got them through insurance, where they were $4 at Walmart with no insurance or coupons.

Sindrome 44 minutes ago 0 replies      
My first software job out of college was working for a healthcare company. I worked in the Drug Comparison team (which consisted of 3 people) finding people ways to save on prescriptions by beating the system using software! Yay, helping the world and stuff.

Actually, we ended up selling out and building systems that benefited healthcare provider's formulary plans. For example, not recommending generics, which was most of what we got paid 500k+ per contract to do.

We did do some amazing work on Medicare Part D stuff, tho. We saved some elderly people tons of money by algorithmically recommending them the right drugs at cheaper cost.

Spooky23 19 minutes ago 0 replies      
I monitor my blood pressure pretty closely, and noticed when my blood pressure medication changed between generic manufacturers, one particular manufacturer was less effective, and my pressure went up 10-15%.
kindawinda 23 minutes ago 0 replies      
The insurers often have PBM contracts that incentivize them to fill specific drugs. This is the most likely reason: https://en.wikipedia.org/wiki/Pharmacy_benefit_management
colordrops 49 minutes ago 0 replies      
I have to take levothyroxine and my endocrinologist insists that I take a particular name brand. The amounts are in micrograms and the dosage varies across brands, and since various pharmacies use different generics, the only way to get consistency is through sticking with a particular brand.
mason55 59 minutes ago 0 replies      
Same thing for me. When I get refills I see whichever psych happens to be free that day and they always comment on how weird it is that my insurance requires the brand name. I always tell them that I assume the insurance company has cut a deal with Shire and it appears I was right.

But my copay for the name brand is only $20 so it never occurred to me to complain.

0x10101 1 hour ago 0 replies      
Some of this insanity is covered nicely in the econlog podcast[1] on the book Drug Wars[2]. The NY Times always wants to throw all of the blame at insurers, but they are trying to save money. The generic system is a Kafkaesque set of regulations and processes.

[1] http://www.econtalk.org/archives/2017/06/robin_feldman_o.htm...

[2] https://www.amazon.com/Drug-Wars-Pharma-Raises-Generics-eboo...

Jeff Deans Lecture for YC AI [video] ycombinator.com
227 points by danicgross  8 hours ago   32 comments top 5
deboflo 2 hours ago 1 reply      
Once, in early 2002, when the index servers went down, Jeff Dean answered user queries manually for two hours. Evals showed a quality improvement of 5 points.
iandanforth 6 hours ago 1 reply      
The notion of running one giant model that has many sub-talents is epic. I can imagine that all the disparate models they run today could fuse into a giant network that melds predictions and guides computation as required by the task. That seems like a very Jeff Dean scale endeavor.
hallman76 32 minutes ago 0 replies      
As a ML enthusiast, this is incredible to watch!

I'm completely blown away that Google was working on full-scale physical architectures that were optimized for these problems. Talk about being two steps ahead of the game!

sputknick 6 hours ago 7 replies      
If Tensorflow becomes the default library for Deep learning, is this a good thing or bad thing? Does it help in that all researchers can focus on what's important (the data and results) or does it hurt in that Google now controls an important paradigm for the next generation of computing?
bluetwo 5 hours ago 6 replies      
If a doctor misdiagnoses an eye ailment, they might end up with a malpractice lawsuit. If an ML program misdiagnoses an eye ailment, what is going to happen?
The BitTorrent Protocol Specification v2 bittorrent.org
98 points by Nekit1234007  5 hours ago   31 comments top 7
jzelinskie 3 hours ago 3 replies      
This update to the spec is a modest change that's largely a preemptive reaction to SHA1 being broken; large portions of BitTorrent are designed around the 20-byte length of a SHA1 checksum. They've decided to move forward with SHA256 truncated to 20 bytes to avoid incompatibilities with existing infrastructure such as the Mainline DHT.

Beyond the hashing algorithm, some important additions that were previously proposals without widespread use (e.g. merkle tree for hashing pieces) are becoming required. The focus has mostly been on optimizing latency for the P2P protocol and making sane improvements to the file spec. I feel like trackers were largely overlooked in this update, but I'm biased because I work on a popular tracker.

Ideally, BitTorrent would be broken down into separate specifications that could be used together or in separate systems: one for the file format and piece representation for sharing files, one for the P2P protocol, and one for discovery (trackers, DHTs). I want to believe that there would be far more interesting P2P projects if you could just lift robust primitives from BitTorrent.
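The truncation approach described above is straightforward to sketch. Assuming the hash of some piece of data, this hedged Python example shows how a 32-byte SHA-256 digest can be cut down to SHA-1's 20-byte length for infrastructure that expects 20-byte identifiers:

```python
import hashlib

# Sketch of the approach the parent describes: hash with SHA-256, then
# truncate to SHA-1's 20-byte length so existing infrastructure (such
# as the Mainline DHT) that assumes 20-byte identifiers still works.
piece = b"example piece data"  # placeholder data, not real torrent bytes
full = hashlib.sha256(piece).digest()  # 32 bytes
truncated = full[:20]                  # 20 bytes, SHA-1-sized
print(len(full), len(truncated))       # 32 20
```

The truncated digest keeps far more collision resistance than broken SHA-1 while remaining drop-in compatible with 20-byte slots.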

Scaevolus 3 hours ago 3 replies      
1) Chunks don't span files. Each file is validated by the hash of its merkle tree. This is the biggest user-visible change, since it means you can download one file without downloading others.

2) SHA1 is replaced with SHA2-256 (2x longer hashes and not broken).

3) Files are represented by a tree structure instead of a list of dictionaries with paths-- this reduces duplication in deeply-nested hierarchies.

4) Backwards compatible-- you can make a .torrent file with both old and new pieces, and a swarm can speak either. This requires padding files from BEP47, which most clients probably don't support.

Per-file metadata increases pretty significantly, from ~19B (just length) to ~68B (length + hash).
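Point 1 above (each file validated by the root of its own merkle tree) can be illustrated with a toy sketch. This is not the real BEP 52 layout (leaf size, padding, and layer handling differ); it only shows the shape of the idea:

```python
import hashlib

def merkle_root(chunks):
    """Toy per-file merkle root: hash the leaves, then hash pairs
    upward until one root remains. A sketch only; the actual v2 spec
    defines leaf size and padding differently."""
    level = [hashlib.sha256(c).digest() for c in chunks]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node when the level is odd
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"chunk0", b"chunk1", b"chunk2"])
print(root.hex())
```

Because each file has its own root, a client can verify one file's chunks without fetching any data from its neighbors, which is what makes single-file downloads clean in v2.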

richdougherty 3 hours ago 0 replies      
smegel 31 minutes ago 0 replies      
Pity we will never see a genuine version of uTorrent that will support it. That was a real loss.
0x0 3 hours ago 1 reply      
What's the stuff about "proof layers", is that new in this v2? The paper briefly talks about proof layer requests. Is this something merkle-tree related? What is the purpose? Is it to prevent clients from lying about having pieces they do not have by requesting a verifiable random hash chunk?
lowglow 4 hours ago 2 replies      
Can someone diff the spec from the previous version? What's the changelog? :)
shmerl 3 hours ago 1 reply      
Do all Bittorrent clients support it already?
Ferret A free software Clojure implementation ferret-lang.org
95 points by greydius  5 hours ago   26 comments top 10
nathell 3 hours ago 0 replies      
It's very notable in that it's literate Clojure.


christophilus 3 hours ago 0 replies      
Discovering this and [Clojerl](https://github.com/jfacorro/clojerl) within a week of each other is like Christmas come early.
jwhitlark 46 minutes ago 0 replies      
dantiberian 1 hour ago 0 replies      
Does anyone have any insight on the legal status of this project? It's licensed as BSD 2-Clause, but looks to be heavily derived from Clojure, which is EPL. AFAIK the EPL is copyleft and doesn't permit relicensing without the copyright holder's permission.
billsix 58 minutes ago 0 replies      
I've been looking for a language which provides Software-Transactional-Memory with persistent data structures which fits in 2KB RAM on a microcontroller.
gleenn 3 hours ago 5 replies      
Seems like a cool project if you already know Clojure and hate the start up time, but also seems like it sidesteps one of the main benefits of Clojure which is Java interop obviously... Which Clojure has to do a fair number of handsprings to accomplish. Begs the question, why pick Clojure as the lisp to compile to C++? Why not Racket or Common Lisp etc?
physicsyogi 2 hours ago 1 reply      
Does this have Clojure's persistent data structures? The docs say it has immutable data structures, which isn't exactly the same thing. (Edit: added the second sentence)
davidgrenier 3 hours ago 3 replies      
Always look at the memory management story when you stumble upon something like this.
comex 3 hours ago 1 reply      
Viewing the site in iOS Safari, I get several pages of what appears to be random binary gunk rendered as HTML. Something wrong with compression headers?
kbuchanan 3 hours ago 2 replies      
Can anybody tell whether it supports multimethods and protocols?
In Earth's hottest place, life has been found in strong acid bbc.com
94 points by happy-go-lucky  7 hours ago   34 comments top 10
knoepfle 6 hours ago 3 replies      
"Pure acid" isn't quite right; it's an aqueous solution with a pH of zero. Zero isn't a special value. You can have negative pH (cf. http://pubs.acs.org/doi/pdf/10.1021/ed083p1465).
SubiculumCode 3 hours ago 1 reply      
What a pleasure it is to have this forum where people can zero in on loose language and logical fallacies. I do not mean this sarcastically in the least.
the_common_man 3 hours ago 1 reply      
Is 45C such a big deal? Chennai (India) routinely gets such temperatures and humidity is like 95%. Look no further for a place teeming with life ;)


slaveofallah93 5 hours ago 0 replies      
"Pure acid" is a pretty meaningless description. Water is a weak acid (and also a weak base) so in a sense ordinary water could be described as "pure acid".
branja 6 hours ago 2 replies      
 "If you fell into 100C hot and extremely acidic water it would be a big problem" (Barbara Cavalazzi, University of Bologna)
Infernal 7 hours ago 1 reply      
In case you, like me, doubted the use of the word "pure" in the headline:

> The team found life in a pool where the acidity was measured as zero pH.


EDIT: Nevermind, someone with a better understanding of pH chimed in: https://news.ycombinator.com/item?id=14951312

danso 4 hours ago 0 replies      
Was expecting this to be another article about yet another environment in which tardigrades have been discovered to survive http://www.nytimes.com/2008/09/16/science/space/16obvacu.htm...
ridgeguy 4 hours ago 0 replies      
The article suggests a considerable degree of physical danger in walking around to collect hot pool water samples. This would be a good use case for drone-based collection, at least for surface samples.
conistonwater 6 hours ago 0 replies      
Isn't one of these lakes the same as the one that Periodic Videos talked about recently (https://www.youtube.com/watch?v=Uaj722cg9Gk)?
21 5 hours ago 3 replies      
So is this 0.2 pH more corrosive/stronger than sulfuric acid or not?
Show HN: How much your home could be worth on Airbnb? eliotandme.com
145 points by edouard1234567  8 hours ago   70 comments top 22
edouard1234567 4 hours ago 1 reply      
For everyone wondering how much the White House could rent for on Airbnb (132 bedrooms/35 bathrooms). $157,920/week


philpee2 6 hours ago 0 replies      
Airbnb already supports this https://www.airbnb.com/host/homes
buf 7 hours ago 2 replies      
I found this to be really inaccurate. I've been a host of an Airbnb for a few years. I've slowly moved my price up until I hit the maximum of $70/day. I have hundreds of positive reviews.

The price the website suggested was $120.

I then went in to put an obscure address where there is no demand, and the price was $140/day. I've rented a hotel in the area for $30/day.

soulnothing 4 hours ago 1 reply      
For my own reference point, in Wilmington, DE it was saying $130 daily for the home, $55 for a room. I rented out rooms at $15 a night. I was constantly told I was overpriced for the area, despite providing free weekend breakfast and Friday night dinners.
Brendinooo 1 hour ago 1 reply      
How is my data being used here? Not super comfortable giving my address and then giving information about the house when there's no information about how the data is being handled/stored.
yellow_postit 6 hours ago 1 reply      
How is this "Artificial Intelligence" (from the about page: https://www.eliotandme.com/about)?
ramphastidae 6 hours ago 0 replies      
Looks great. Consider adding dishwasher and washer/dryer as factors too I found those made a big difference (20%+) when doing my own comps research.
zeep 2 hours ago 0 replies      
Is that how much you could get if you are aiming to get close to 100% occupancy?
akcreek 6 hours ago 1 reply      
Gave me the exact same rate for my current apartment (1600sq ft Capitol Hill Seattle) and my previous condo (1100 sq ft NuLu Louisville, KY). My rent here is 3.5x what my mortgage was in Louisville and from personal experience in both places the demand for short term rentals in Louisville is nothing like it is in Seattle. So yeah... probably not accurate for the addresses I checked.
jimmyrocks 7 hours ago 0 replies      
This could also be used when looking for an Airbnb. You could use it to get a rough price estimate in an area near your destination.
dogan 1 hour ago 0 replies      
Seems accurate, might be better to consider current rating, amenities, etc.
drcongo 6 hours ago 4 replies      
Apparently Trump could get $1,150 per week letting out the White House.
kchoudhu 4 hours ago 1 reply      
There's no way a 4BR/3B house in Las Vegas (Summerlin, to be exact) is going for $1,500 a week.

Nice try, though.

thyselius 6 hours ago 0 replies      
Quite accurate estimate on our apartment in Stockholm, Sweden (we rent out sometimes at a slightly higher price than estimated)
nick-cortes 7 hours ago 3 replies      
Curious to see how it relates to the mortgage of some houses.
Fuzzwah 7 hours ago 1 reply      
Hobart, Australia: the estimate from this page for 1 week of renting would cover a monthly payment on an interest-only mortgage.
muninn_ 6 hours ago 1 reply      
Cool idea, very inaccurate.
borne0 6 hours ago 2 replies      
question for people who use airbnb: If you've had a good experience with a host and you're going back to the area would you bother booking through Airbnb or just contact the host directly?
lwansbrough 4 hours ago 2 replies      
Shouldn't it be EliotAndI.com?
jdlyga 5 hours ago 0 replies      
If only it was legal in my state!
itsrishi 6 hours ago 0 replies      
I know some superhosts who use airdna.co to find new properties to invest in and price listings. It's not free though.
asdf33323 7 hours ago 0 replies      
awesome! well done.
A Tech Leads New Project Checklist insimpleterms.blog
154 points by LeonigMig  8 hours ago   28 comments top 7
megamindbrian 2 minutes ago 0 replies      
Here is the other one from today. I should turn this into an app-style application for employers/recruiters to fill out before they contact me. https://www.sideprojectchecklist.com/marketing-checklist/
JackC 6 hours ago 4 replies      
"Understand the importance of the Trinity of delivery: Delivery manager, product owner, tech lead"

I'm only familiar with teams too small to have separate roles like this. How does good software planning scale down to smaller teams -- say 5 or 10 people in the whole organization? Ultimately someone has to be responsible for the same concerns, but I wonder how it maps.

In general I'd love to see a comparison of software teams at different sizes. What are the key, identified roles in a company of 5, 50, or 500? What are habits that smaller organizations ought to borrow from larger ones?

Domenic_S 8 hours ago 4 replies      
"What's the budget and the value proposition?" should be #1. Projects without a clear purpose (happens way more than you might think) are sinking ships you've got to get away from.
tomcam 7 hours ago 0 replies      
My primary requirement is a demonstration that the site can be torn down and recovered at any time; you never know when you will have a DNS problem or someone take over your domain.
anotherevan 4 hours ago 0 replies      
Unrelated minor rant: Pocket is hopeless when dealing with bullet points. It so often skips list items, especially if they have links in them. I understand that they are trying to avoid including navigation stuff, but it's overzealous way too often.

I wish I could integrate Instapaper with my Kobo ereader instead.

dbg31415 3 hours ago 0 replies      
Ask about approved licenses and also make sure you know what you've purchased when it comes to platform service levels.
throwme_1980 6 hours ago 5 replies      
here is what i usually do:

-identify weak people in the team with a stinky attitude, those who don't learn and are toxic; do your best to get rid of them.

-next identify those mediocre people doing 9-5, maximise their output during those hours, don't give them no slack.

-finally identify your super stars, cherish them, buy them coffee/lunch and give them a lot of slack.

for this is a meritocracy, no damn Disneyland.

Google Fires Employee Behind Controversial Diversity Memo bloomberg.com
557 points by QUFB  2 hours ago   713 comments top 88
sctb 2 hours ago 0 replies      
We've closed this thread to new accounts because of trolling. If you're going to comment here, please take extra care to make it civil and substantive.
marcoperaza 2 hours ago 25 replies      
The author's arguments have been completely misrepresented. He pointed out widely-believed and sometimes scientifically-established differences in the DISTRIBUTION OF traits in men and women. He said that those differences make attempts to achieve numerical parity misguided, discriminatory, and harmful. What is his conclusion about how we should behave? "Treat people as individuals, not as just another member of their group." Wow, what a monster.

The reaction to the memo is really the most damning thing about the whole affair. Everyone is just rushing to virtue signal, to demonstrate their own purity of thought. They've just proved the author's point. Honestly, Google might have even been rational to fire him, due to the toxic situation created by the mass outrage. How incredibly damning of our society.

A particular brand of liberalism has reached the point of being a religion, and the establishment is running an inquisition against any who dare to question its points of dogma.

This is the closing of the American mind.

zorpner 2 hours ago 14 replies      
Good. I get that HN is not into this, but this employee stated, bluntly, that they don't believe many of their colleagues should be there because of their sex. Every peer review, every no-hire, every interaction, is and should be suspect.

Happy where I am now, but future interviews will include me asking what management would do about this, and termination is the only correct answer. This is textbook hostile environment. Anyone saying otherwise should look hard at why they value this person's desire to speak without consequence over their colleague's right to a workplace where they're not judged by their sex, race, sexual orientation, or any other innate attribute irrelevant to their job performance.

adbge 2 hours ago 3 replies      
An excellent opportunity to re-read PG's essay, "What You Can't Say". [1]

"What scares me is that there are moral fashions too. They're just as arbitrary, and just as invisible to most people. But they're much more dangerous. Fashion is mistaken for good design; moral fashion is mistaken for good. Dressing oddly gets you laughed at. Violating moral fashions can get you fired, ostracized, imprisoned, or even killed."

[1] http://www.paulgraham.com/say.html

tunesmith 2 hours ago 1 reply      
People can't seem to summarize his argument without getting much of it grossly wrong, because his manifesto was a haphazard collection of good points, bad points, good arguments, lousy arguments, misrepresentations of others' views, and unstated implications. A perfect recipe for people to argue past each other about it.

It's worth remembering that one of his conclusions was to end or replace gender-based diversity programs at Google. Given that, it's easy to understand why people would be upset. If gender-based diversity programs are responsible for qualified women getting jobs that they otherwise wouldn't have gotten due to bias, then the lack of that program means those women wouldn't have gotten those jobs.

postnihilism 1 hour ago 1 reply      
This feels like the blue and black (white and gold) dress. It boggles my mind that people don't see the fundamental and toxic misogyny in this 'manifesto'. Please have a woman you care about in your life, preferably one in tech, read this and then ask their opinion of the piece.

The "treat people as individuals, not as just another member of their group" sentiment of the author is fine except that we have hundreds of years of doing just this in order to oppress and disenfranchise groups of people. Diversity programs are not about lowering the bar, they are about outreach and working against institutionalized racism and sexism which has created the distribution of wealth and education and work culture that we have today. Given the massive disparity we see in tech it's ridiculous that this individual felt the need to lambast the relatively minimal amount of work being done to foster a more diverse and inclusive culture across the industry.

If one of the things you have to deal with as a woman in tech is seeing 10 page pseudo-intellectual manifestos about your inherent inferiority at performing in technical and leadership roles published at one of the premier tech companies in the world, and then see that piece supported on the most popular tech social sites then it's no wonder we have the gender gap we see today.

When somebody's views are being attacked for being misogynistic and alienating to their female colleagues, it is not suppression of free speech and diverse political opinion; it is common decency. Nobody is infringing on your free speech, but they will respond. All of these cries of 'authoritarian left-wing thought-police' make me think we need a manifesto on White Male Persecution Complex Culture in Tech.

[disclaimer: I work at Google, my words are my own and not my employers]

mehwoot 1 hour ago 1 reply      
Reading the letter, what surprised me was how political it was, framing everything as a "left vs right" cultural fight. I think if it was up to me I'd probably fire anybody on either side of that debate who started circulating shit like this. As soon as you're on that level, nothing good is going to come of it and you're just going to make a lot of people angry, which is very bad for the business in a lot of different ways.

The workplace is no place for politics like this. If you are going to strictly stick to narrow issues that are relevant to the job, then maybe, but as soon as you're writing a 10 page manifesto with phrases like "the Left's affinity for those it sees as weak" or "some on the Right deny science" or "the Marxist intellectuals transitioned from class warfare to gender and race politics" you are way out of line. It doesn't matter if you are correct or not, politicising your workplace in that way shows a stunning lack of judgement.

Waterluvian 2 hours ago 0 replies      
I believe that equal opportunity among all persons is manifest. But I think the manner in which we encourage/enforce/promote it, especially in the workplace, demands debate. We must be able to disagree with the methods without any question of our belief in their purpose.

The most disgusted I've ever felt at a workplace was when my boss, prior to me interviewing someone, said, "and she gets an extra point because she's a woman." The government punishes us on research tax breaks if we don't have enough diversity.

So what am I supposed to say to this person when she asks what set her apart from the other candidates to be hired? "No, you weren't the best candidate, but you are a woman." That disgusts me and I refuse to do it.

My job is not to balance an arbitrary math equation that x% of engineers are supposed to be women. That's an issue far larger and more systemic. It can't be fixed this way.

What I can do is remind myself that my job is to pick the best candidate, and my definition of "best" may be fraught with bias, so I need to be exceptionally perceptive to question what capabilities each candidate might bring to the job that I don't naturally consider to be ideal.

Apologies for the ranting nature of this comment. I feel frustrated when faced with the reality that what the establishment wants is at such odds with my morals and convictions.

hardwaresofton 2 hours ago 3 replies      
Yep, after reading the public response provided by Google's new Vice President of Diversity, Integrity & Governance (https://gizmodo.com/exclusive-heres-the-full-10-page-anti-di...) I was pretty sure this was all that was left.

Regardless of whether you agreed with the letter or not, it's 100% correct in asserting that it's super difficult to have productive, rational conversations about the issue of diversity. Google just reinforced that fact.

RealityNow 1 hour ago 0 replies      
It's incredibly ironic that a company that apparently cares about diversity enough to have a "Vice President for Diversity" fired an employee for presenting an opposing viewpoint - to their diversity policy of all things.

Further evidence that Google and the other large institutions don't actually give a rat's ass about "diversity". Diversity has absolutely nothing to do with diversity of thought, and is only concerned with normalizing racial/gender composition to present the illusion that discrimination and biological variation are non-existent (except for discrimination against males, whites, and Asians, because for some reason it's not considered discrimination if it affects them). It's not about doing the right thing, it's about PR - hence the decision to cave in to whatever the pitchforks are demanding.

It's a shame that honest criticism of diversity policies and gender issues is considered taboo enough to get someone fired. If we really cared about diversity, then we'd welcome opposing viewpoints and counter them with facts, not silencing. We have a culture of anti-intellectualism and dogmatism around certain topics that for whatever reason are considered sacred and not allowed to be challenged (i.e. political correctness), and it's disgraceful.

Relevant video: Peter Thiel: What is Multiculturalism Really About? [1996] https://youtu.be/E6cxRYgqfHY

escape_goat 1 hour ago 2 replies      
The best explanation of why he was going to be fired and why that was the right thing to do (that I read) came from Yonatan Zunger, who had recently left a Human Resources position at Google; especially his third point, in the following post on Medium:


In many ways, it was not an unfamiliar sort of rant for the internet, and I was struck by the author's earnestness and apparent sincerity. It's possible that I make too many allowances for behaviour, and it's possible that I'm easily misled.

I thought his ideas were not in any way useful, or actionable, even had he been correct;

That the ideas were poorly expressed, and full of such fringe 'truths' as are derived, insincerely, from cherry-picked science in order to be sold as snake-oil cures for the cognitive dissonance of the conservative and vulnerable;

That the expression of these ideas was immensely foolish, especially appearing in the context of what I took to have been his initial intent: an appeal for tolerance of diverse viewpoints at Google.

However, I'm always most disturbed by vitriol online when it's in service of beliefs that I share. It makes me deeply uneasy.

drawkbox 1 hour ago 0 replies      
It was a bit of a Jerry Maguire moment even if you agree with some of it.

At work you should be professional and not only work to make yourself and the company/product better, but the people around you better.

It is mostly not healthy to get political or ideological at work unless you want to divide people. A company and employees really shouldn't get political if at all possible, to prevent a divided customer base. You should treat everyone at work like a client and not go all tribal or into cliques that end up in groups that are constantly complaining.

If you don't like something you can work to change it, but in a company the size of Google that is not always possible. If you don't like the ideology of a company then leave. Do a good job yourself, set up your own thing if you want to control everything, but don't bring down other individuals; encourage them and make them better, create respect internally. Don't rock the boat; try to guide it, and if you can't, hop on a new ship. In the end we live in a free country but work for companies that are more dictatorial/authoritarian, where we are just sharecroppers on their feudal land. Companies are not democracies unless they are really small, and even then they are not.

Sometimes manifestos are needed [1] but for the most part it is like calling out an employer. Not only will it probably not change the company, it will follow you around for better or worse.

Oh, there's one final lesson: you never know when something you write is going to unexpectedly be published in the Wall Street Journal. So watch those split infinitives.

[1] http://www.businessinsider.com/what-i-got-wrong-in-the-peanu...

Afforess 2 hours ago 4 replies      
I read the memo and didn't find it particularly persuasive; but this dismissal does further its core point. It's a bit tone deaf of Google to fire an employee concerned about groupthink.
bronz 2 hours ago 2 replies      
the author of the memo is completely correct and, as others have said, his memo is drastically misrepresented in the coverage it has received. he simply points out that women could have less desire to go into fields such as cs because of their biological makeup. he never says that women should leave google -- he only says that trying to reach a perfectly equal distribution is misguided. perhaps those who are red in the face while reading the memo failed to read between the lines and realize that under this logic, the women who do find themselves wanting to go into cs for passion and not just money should be welcomed. he never once called for anything other than the policy of acceptance and wide open doors that is currently in place. he is merely suggesting that maybe a perfect 50/50 distribution is not needed and, more importantly, liberal people in general need to snap out of the double think where we believe men and women are the same. men and women are not the same.

as a liberal i find that it is important to not shy away from controversy: women and men are biologically different. i am sorry, but they are. you dont need to reject this fact in order to treat women with respect and equality.

wyclif 2 hours ago 11 replies      
Fun fact: the author of the memo earned a PhD in Biology from Harvard. Google can fire him and get away with it, but it's going to be tough, very tough, for him to be falsely characterized as someone who is misinformed or unscientific. And the court of public opinion may well be more important in this controversy than any court of law.
jrs95 1 hour ago 2 replies      
This is disappointing. I didn't really agree with his premise, and I think he was generally wrong honestly, but I don't think his views were so unrealistic or offensive that this was warranted. I think this was probably done mainly because there was negative media about it. Which sort of demonstrates that Google is just another corporation that doesn't really value its employees that much. Maybe as a collective, but not on an individual basis. I think it's important not to let their moral posturing about social issues cloud our judgement about that.
ViktorV 54 minutes ago 0 replies      
What I don't get in many of the left-leaning comments is that they're basically saying that 'patriarchy'/society is the only reason why women are underrepresented in tech.

In India/Russia, there are way more women represented in tech than in the US. Do you think that these cultures are more welcoming/less sexist to women than the US? It should be so in your worldview.

I suspect that the companies in these countries don't have diversity programs, and they hire everyone for purely business reasons. Why do you think that diversity programs solve anything in the long term?

I live in Eastern EU and many people are actually sexist here (men and women are in favor of enforced gender roles), unlike in the top 20 countries. But I suspect the gender disparity in tech is better than in the US, or largely the same. Doesn't this kind of disprove the notion that gender differences are caused by sexism?

jorgemf 1 hour ago 1 reply      
From the code of conduct of google [1]:

> Equal Opportunity Employment

> Employment here is based solely upon individual merit and qualifications directly related to professional competence.

How are you going to achieve 50/50 diversity in the company if the pool of candidates is far from that distribution? (unless you go and don't hire the best ones to balance the distribution)

[1] https://abc.xyz/investor/other/google-code-of-conduct.html

EDIT: As kevingadd said, I might be wrong assuming the goal for diversity target for Google. It can be something more realistic and be close to the ratio of the pool of candidates. In that case I am wrong in my assumption. This is also a good link he provided: https://www.theguardian.com/technology/2017/aug/07/silicon-v...

williamaadams 3 minutes ago 0 replies      
When I read this manifesto, I thought "well, this is going to generate a lot of news".

What I did when I read it was substitute "black" for "woman", roll back the clock to the Jim Crow South, and read it again.

That might be an instructive exercise for anyone who hasn't actually experienced discrimination, or rather, primarily lives in the dominant side of any culture.

I would rather see this stuff out in the open, rather than hidden behind the glares and glances of colleagues. His firing is unfortunate, but totally understandable. This dialog could have occurred in a much better format.

In the famous words of Bill and Ted, "Be excellent to each other..."

donohoe 1 hour ago 0 replies      
If you haven't read this post by Yonatan Zunger (former Googler) that deals with this specific situation, then I suggest you do:


It makes the clear case that this employee went beyond opinion and created a harmful situation for himself and his colleagues.

jorgemf 2 hours ago 2 replies      
> he had been fired for "perpetuating gender stereotypes."

Of course, because a policy of positive discrimination for minorities is not perpetuating any stereotype...

Equality doesn't mean equity. Equality is not fair.

P.S. downvote whatever you want, it is called discrimination for a reason (even if it says positive before)

maxxxxx 2 hours ago 8 replies      
This is starting to be scary. Instead of the government, companies are acting as thought police. I think a lot of people will watch what they say publicly from now on.
peakai 1 hour ago 0 replies      
This only darkens the landscape surrounding gender issues. Considering Google's flip-flopping from the initial response to their actions after the public backlash, it only seems to confirm that internally there is much division over how this is to be handled on an organizational level. A step backwards for the culture as a whole, if we cannot get the best minds together to tactfully resolve such a sensitive topic.
thesmallestcat 1 hour ago 1 reply      
Maybe Google fired him for being an idiot as they value smarts? Unless you're trying to get into politics, you have to be pretty dumb to publish something like this under your own name, let alone distribute it at your job. Not a big surprise.
thex10 20 minutes ago 0 replies      
As a minority woman in tech I'm not bothered much by his document, but that's only because I don't work with this guy or folks like him. His manifesto doesn't pain me, but that's only because I have a secure job coding all my favorite things with people who don't spew this kind of unsubstantiated crap to my colleagues, giving them spurious reasons to doubt my abilities and inherent qualities.

My sympathies are with those who work in more precarious situations. I'm no authoritarian, but I'm ok with this guy getting fired - he used his platform unwisely and at this point the damage control gained by firing him probably outweighs whatever benefit the company might get by keeping him.

However, I'm plenty perturbed by the willful lack of understanding all around these parts about (at the very least):

- how diversity programs work / what they do
- how hiring works
- how oppression works
- how public words can affect others

notliketherest 36 minutes ago 0 replies      
We value diversity of thought. As long as it's been pre-approved by the Thought Police in HR. Sorry, Google, you can't have it both ways. To those thinking about leaving: trust me, you'll never look back.
nancye 21 minutes ago 0 replies      
I'm not one to root for people to be fired for holding what I'd consider dumb or ill-informed opinions, but at the same time, as a woman, do I actually want to work on a team with someone who thinks I'm biologically unsuited to the job I'm doing and who is presumably judging my work more harshly and waiting for me to fail at things, who may have the power to review me poorly and affect my long-term career prospects? Gee, that's a tough one. If you've never had to work all day, every day at an office with people who treat you like that, or had to worry about being judged not on how hard you work but on who you are, be thankful. And that's obviously not a great look for a company that's already being investigated for discrimination, so I'm not sure they really had much of a choice here. Bringing bad PR to your company and potentially exposing them to legal liability usually doesn't end well.
peacetreefrog 1 hour ago 0 replies      
It's crazy to see such overtly anti free speech arguments in the discussion around this memo, even on Hacker News.

"Well actually, the first amendment only protects against government intruding on free speech, it doesn't apply to the rest of us!"

Free speech used to be the thing everyone agreed on. It's like a fundamental tenet of Western, liberal democracy has gone out of vogue.

nsxwolf 2 hours ago 1 reply      
Um, yeah... I'm just gonna go ahead and not have an opinion on this one.
LouisSayers 1 hour ago 1 reply      
Where is Google's rebuttal of this memo? All anyone is saying is "it's wrong" etc, but where is an ACTUAL breakdown of what was incorrect about what he wrote?

I feel like most "rebuttals" are like listening to Donald Trump. "He's wrong and we're right. We're definitely right."

If you do have a link, please share!

KKKKkkkk1 2 minutes ago 0 replies      
This individual apparently has a biology PhD from Harvard and he has only two published papers. How common is this in biology? Seems very low compared to computer science.
bangbang 1 hour ago 0 replies      
So much of the discussion is about whether what he said has any truth. I think this is beside the question. He wrote and distributed materials criticizing his employer's core HR mission while alienating coworkers.

If you're a person targeted in his manifesto, I'd suspect you'd no longer want to work alongside him. That's reason enough to fire him on the spot. He's creating a hostile work environment for his coworkers, amplified by the media.

For those that doubt it, I say try it at your job and see what comes of it.

Honestly, what did he think would happen?

coss 1 hour ago 0 replies      
I thought it was at the very least an attempt at a well-thought-out unpopular opinion that should have been met with a productive rebuttal. To fire this guy, though, sort of proves his point. You can't speak out against the safe space Google is trying to create without fear. If he's wrong, then argue his reasoning. Don't attack his character and silence him.
perfectstorm 5 minutes ago 0 replies      
honestly I haven't read the original post/email/letter. I'm sure most of the keyboard warriors you see on here as well as Blind or any other forums haven't read the whole thing either.

I have a feeling that people jumped to conclusions before reading it or understanding the author's POV. I'm frankly surprised by Google's reaction. They should have debated this internally instead of firing an employee because he/she expressed their opinion. Even if the author was wrong, they could've countered his/her points by citing proof/scientific studies.

sidcool 7 minutes ago 0 replies      
I strongly feel this was a PR move by Google. I read the article and although I disagree with many of the points stated, firing the person was an extreme step. It would set a wrong precedent for anyone trying to speak up against the majority. A behind-the-door warning and a rap on the wrist would have been enough.
plainOldText 11 minutes ago 0 replies      
Does anyone know why isn't this story higher in the news feed? For the past two hours it has had over 500 upvotes and over 600 comments and it's still at position nine.
bronz 1 hour ago 0 replies      
i absolutely hate the situation that we have found ourselves in. gender and race politics have become totally divorced from the issues of gender and race. firing people who speak their mind is not going to help women in tech and it wont stop black kids from being shot in the street in chicago or oakland. not a single person i have ever known who played the politically correct game has ever lifted a single finger to help black people or little girls interested in tech. the sad truth is that gender politics is now a game of virtue signalling -- a status game among liberal people to see who is the most virtuous, the kindest and most thoughtful. meanwhile, any person who does not fit into a beneficiary position in this game (for example the entire middle section of the united states) is treated with savage cruelty. it is astonishing to see my ultra liberal friends whine endlessly about how unfair life is, how painful life is for certain people, and then turn around and in the same breath condemn and disparage millions of people for absolutely no reason. the truth is that a truly kind person reserves kindness for everyone, even someone who might be mean or unpleasant at first -- i have found that kindness is much more effective at changing peoples perspective than anything else, so if you want mid-westerners to stop being racist, remember that being horrible and mean to them probably wont make them see things from your perspective.

i think there is some kind of effect where people who are really good at solving complicated puzzles that are right in front of their faces are not so good at solving logical issues that are less tangible or more long term. thats one of my theories, because a lot of people i know that arent dumb at all seem to totally buy into the gender politics thing, even though their beliefs totally fly in the face of logic. maybe its me who is wrong, and there is some subtle aspect of this whole thing that i am not grasping. either way, we are all in this together. its important to be patient with each other and to never allow ourselves to descend into savagery and hatred. and if you do find yourself behaving like that, its ok. we all make mistakes, dust yourself off and try again. i know i have.

mikeash 1 hour ago 0 replies      
It's strange how people who say that certain groups are inherently better at something are invariably a member of that group.

To put it more bluntly: it's strange that the people saying white men are just inherently better at stuff are always white men. Oh well, I'm sure it's just a coincidence.

When there are real, strong differences, usually a diverse group of people are willing to accept it. Plenty of women will tell you that men are better on average at lifting heavy objects or reaching the top shelf.

But ask about computer programming, and it's invariably the supposed "in" group saying they're better. What a strange coincidence.

plainOldText 1 hour ago 0 replies      
Somewhat related to the issue at hand and representative of the current, very charged landscape on social issues, is this video: https://www.youtube.com/watch?v=Gatn5ameRr8

The speaker makes the case that current social conflicts at many American universities (though probably representative of the general population as well) arise because people are pursuing two potentially incompatible goals: truth and social justice. I think it's a highly informative video.

Of course, I'd love to watch other videos contradicting or challenging this one, if anyone can provide some links.

dontnotice 2 hours ago 1 reply      
I think managerially this is the right move; the guy created chaos and had to go.

I'm not sure that's the right move in any other respect, it certainly doesn't advance the conversation.

0xBA5ED 2 hours ago 1 reply      
fired from a tech company for "perpetuating gender stereotypes"...

What in the world is happening right now? This is absurd beyond reason.

jannettee 1 hour ago 0 replies      
Dude gets fired for violating company policy, people have been fired for much less. Why is everyone getting their panties in a knot? Why isn't this post being flagged like all the other ones about the manifesto itself?
trentnix 36 minutes ago 0 replies      
They decided he was a witch, and he was burned at the stake. That's it in a nutshell.

They won't much enjoy a world where these new rules they've created might apply to them.

thex10 1 hour ago 0 replies      
Well, I'm not the least bit surprised - you don't just get to spam your colleague network with your non-work-related hot-button-topic nondeliverable and get anything productive out of it.

If he's actually interested in the topic of working towards better workplace diversity, he could read a book, write some papers, put together a panel discussion, organize a conference, make an educational video...

jmorphy88 1 hour ago 0 replies      
1) why is it a moral imperative to have 50/50 gender balance in a company?

2) why is it wrong to have a non-diverse company?

omot 1 hour ago 0 replies      
I'm going to invoke Occam's razor here. Let's simplify diversity and just look at height. Let's assume men that are short and men that are tall are biologically equivalent, meaning biological traits that might surface are evenly distributed. Now let's inject 100 men of varying height into the general population, and... Uh oh, there's a skew in leadership positions and income. Taller men have better positions and higher incomes.

People who are taller all have tall coworkers. They work for a boss that's a little bit taller than them. Some of them start thinking that, you know, this is probably just natural: men who are shorter are just biologically unsuited to be leaders. They aren't aggressive enough and they just don't have as much drive; it's probably just written in their DNA.

The short men, no matter how much they perform or how brilliant they are, always seem to be sidelined for promotion. Some of them make it pretty far, but they're performing 50x compared to their peers, and they're always sidelined when it comes to executive promotions. Other executives think: "this guy is brilliant, but what would people think about us... We better promote the other, less brilliant tall guy. We could retain investor confidence." Some of them break out and try to start a company. They can't get any funding, and no one wants to join their company. People think it's a company run by a short guy: this guy is brilliant, but he's not going to do well in the long term. So they end up joining startups that have tall guys.

This network effects over a million times.

Now let's take people's perception, and assume people perceive men and women exactly the same. The catch here is that women are one standard deviation shorter than men. Just from height you'll see a discrepancy in men's and women's representation in leadership positions. Let's end height discrimination first. I believe this is a simpler explanation of the discrepancies in representation.

mc32 2 hours ago 1 reply      
I feel Google cornered itself into this pickle. On the one hand, they need to grow their pool of talent and make the non-majority feel comfortable, which makes those not in the majority feel they have more than average support from management, who in turn turn a blind eye to anti-majority bashing.

On the other going above and beyond for underrepresented groups will eventually grate against the majority when they ham-fistedly overcorrect.

This engineer likely got a nice severance or will get his due in court. Seriously, you don't fire someone for having a non-milquetoast opinion.

I mean, I expect this heavy handedness from a Walmart but not from a Google. There are other more finessed ways to get your corp culture disseminated.

Off topic, but not having a union allows Google to fire these offensive employees, unlike School districts who must retain people who grind their own little axes.

deepnotderp 19 minutes ago 0 replies      
Wow. I have so much to say about this, but I'd rather not be publicly crucified for this.

Chilling indeed, so much for free speech and rational discussion.

CPAhem 1 hour ago 0 replies      
Dare I say, but people should be employed for their abilities not their gender, race or value to some checklist.

Those who develop greater abilities through study should find greater opportunities in work, regardless of their gender, race, sexual orientation or negative score on some diversity checklist.

sna1l 2 hours ago 7 replies      
Not a lawyer, but will he have a case to sue for wrongful termination? I guess without seeing the employee contract and Code of Conduct, it is hard to say.
kozikow 1 hour ago 2 replies      
Suppose that you work for the company that consists of 80% vimers and 20% emacsers.

All documentation is geared towards vimers. There are interest groups geared towards vimers. No one questions you if you are a vimer. If you are stuck with some error as an emacser, there are fewer people to turn to or to discuss your favorite text editor with, so some people start to suggest that emacsers are worse programmers.

The company analyzes the research and decides that vim and emacs programmers are equal. If the company decides to invest in supporting emacsers, by requiring more documentation to be written for emacs and funding emacs meetups, it is not discrimination against vimers. The company recognized that if it doesn't offset some of the benefits vimers have, it will continue to lose out on a group of talented programmers.

wolco 1 hour ago 0 replies      
This is the start of Google becoming less technical and a less desirable place to work.

That might be okay for them. They had too many applications anyway.

sergiotapia 1 hour ago 0 replies      
I hope he sues Google for wrongful termination and gets millions.
megamindbrian 42 minutes ago 0 replies      
I guess I won't be working for Google anytime soon.
cratermoon 41 minutes ago 0 replies      
This essay, from a former Google employee, touches on the themes that go to the heart of why supporters of this guy are flat-out wrong in their take on the situation. Tolerance is not a moral precept: https://extranewsfeed.com/tolerance-is-not-a-moral-precept-1...
chiaro 1 hour ago 0 replies      
A relevant kid's story:

The Racist Tree

By Alexander Blechman

Once upon a time, there was a racist tree. Seriously, you are going to hate this tree. High on a hill overlooking the town, the racist tree grew where the grass was half clover. Children would visit during the sunlit hours and ask for apples, and the racist tree would shake its branches and drop the delicious red fruit that gleamed without being polished. The children ate many of the racist tree's apples and played games beneath the shade of its racist branches. One day the children brought Sam, a boy who had just moved to town, to play around the racist tree.

"Let Sam have an apple," asked a little girl.

"I don't think so. He's black," said the tree. This shocked the children and they spoke to the tree angrily, but it would not shake its branches to give Sam an apple, and it called him a nigger.

"I can't believe the racist tree is such a racist," said one child. The children momentarily reflected that perhaps this kind of behavior was how the racist tree got its name.

It was decided that if the tree was going to deny apples to Sam then nobody would take its apples. The children stopped visiting the racist tree. The racist tree grew quite lonely. After many solitary weeks it saw a child flying a kite across the clover field.

"Can I offer you some apples?" asked the tree eagerly.

"Fuck off, you goddamn Nazi," said the child.

The racist tree was upset, because while it was very racist, it did not personally subscribe to Hitler's fascist ideology. The racist tree decided that it would have to give apples to black children, not because it was tolerant, but because otherwise it would face ostracism from white children.

And so, social progress was made.

lettergram 1 hour ago 0 replies      
What's crazy, if these roles were reversed... the ACLU would be all over it.
ameister14 2 hours ago 1 reply      
They were stuck between a rock and a hard place and chose what was less likely to affect ongoing litigation and what most people in Silicon Valley will accept. Other people were threatening to quit; they had no real choice here. This was the better of two bad decisions.
makecheck 2 hours ago 1 reply      
We should be able to apply security technologies to online discourse. Theoretically it should be possible to separate real identity from virtual identity: maybe I publish a bunch of stuff in such a way that no one can ever figure out who I am, but anyone can verify that a series of different comments/articles/etc. were all made by the same person. Then you could decide to follow people who publish things that interest you without having any feasible way to identify them (you can't fire them, you can't meet them, etc.).
schimmy_changa 2 hours ago 0 replies      
Already mature discussion about the issue here: https://news.ycombinator.com/item?id=14948857
hooluupig 1 hour ago 0 replies      
I disapprove of what you say, but I defend to the death your right to say it. Do you still remember 'Don't be evil'? SHAME ON YOU, GOOGLE.
HD134606c 1 hour ago 0 replies      
We haven't seen anything yet. What we're witnessing is the transition from capitalism to Aldous Huxley's Brave New World, and things like this termination are an obvious milestone. That may sound like a bold claim but let me explain.

Allow me to back up a little bit. First off some context: a lot of people don't realize it but we are a lot closer to a post-scarcity world than the world would have you think. Check out this chart which shows GDP per capita since the 1950's. The productivity gains since the 1950's have been absolutely incredible, and the quality of life back then was pretty good. https://fred.stlouisfed.org/series/A939RX0Q048SBEA

Have you ever noticed there's never any dialogue about encouraging men to be stay at home dads, or reducing the overall household number of hours worked per week? Never. The dialogue is always about the "wage gap" and "women have value too" and "rape culture" and "microaggressions". Men who are stay at home dads still get shamed just as much as they did during the 1950's. This is how you know there's something wrong - there is never any serious dialogue about actual equality. Income has in no way, shape, or form kept up with the GDP per capita shown in the chart above. There's never any exploration of policies that would actually increase equality, like restricting the number of "investment properties" a man or woman can own, behavior which is clearly parasitic. https://commons.wikimedia.org/wiki/File:US_productivity_and_...

It's not in corporate interests to have people have actual equality. Increasing the labor pool without discussions of actual equality makes it so that people can be kept in debt, wages go down and the nexus of power moves away from the family and towards the corporation, which is what is happening.

"Women have value and should be working full time too". The implication here is that if you are not working for money you have no value. Despite common belief, it IS in fact possible to generate value outside of a money context. Many of the world's greatest achievements have occurred outside a money context, e.g. the discovery of calculus, Wikipedia, Linux, countless famous works of art, literature, and philosophy. Saying that you only have value if you earn money throws many of the world's most accomplished people under a bus. The reason why "money is the only form of value" is such a horrible mentality is that it leads to people like Mozart dying in poverty and being thrown into a ditch, which actually happened.

Check out charts of combined household numbers of hours worked, you'll see it's going way UP not down, despite the GDP per capita chart shown above. There is clearly something dark in that picture. http://www.bls.gov/opub/working/chart17.pdf

I don't think these are idle complaints - feminism in its current form is an ideology that's on a direct collision course with the whole 'robots are about to take all the jobs' reality, which I think is going to come a lot sooner than we realize, and when these two phenomena collide, what's going to happen is that it's not going to be equality (sorry folks) it's going to be Brave New World, an immensely stratified society.

I work in one of Alphabet's departments where everything you do is constantly monitored. Your teammates conduct detailed psychometric analyses of you, beyond the simple perf of yesteryear. It's beyond cult-like. It's Brave New World.

alexpetralia 1 hour ago 0 replies      
Surprised to see no mention of John Haidt and the "illiberal left" thus far.
dcow 1 hour ago 1 reply      
This is why I'll never work for Google. Plain and simple.
pfarnsworth 16 minutes ago 0 replies      
What we all in tech are forgetting is that this gender gap exists in all fields, including medicine. The more interesting problem with medicine is that the pipeline is basically at equity (close to 50/50 male/female graduates).

However, according to this article (https://amino.com/blog/how-the-gender-gap-is-shifting-in-med...) even medicine is seeing severe gender gaps based on the specialty that is chosen.

This was their analysis:

Only 7% of all orthopedic surgeons; 15% of orthopedic surgeons beginning their practice in 2015

9% of urologists; 21% of urologists beginning their practice in 2015

13% of cardiologists; 25% of cardiologists beginning their practice in 2015

Why are there so few women working in these specialties? According to Mayo Clinic's Ian Mwangi, "There's a stereotype that orthopedic surgeons are jocks, that the field requires brute strength." There's a lengthy feature article on Orthopedics Today that explores other reasons women aren't entering orthopedics, including less exposure to the field in medical school, discouragement among advising faculty and deans, and the perception of poor work-life balance.

NPR explains that there may be gender disparity among urologists because of a "misconception of the field ... that urologists treat male problems like prostate cancer and erectile dysfunction."

Women seeking a career in cardiology face similar deterrents, including the impairments to family planning, poor work-life balance, and perceived radiation risks, according to this fellows' perspective published in the Journal of American College of Cardiology.

In most of my research, I found that a lack of female role models was also a key reason why more women didn't enter these specialties, echoing a trend across industries. Doximity reports that women represent only 22% of physician leaders.

robbiemitchell 40 minutes ago 0 replies      
What did he want to happen?

What did he think would happen?

Bonus: Why didn't he just post this to Reddit?

devrandomguy 1 hour ago 3 replies      
As a white man, I have concerns that my ethnic / gender group is being persecuted. How should I raise these issues for discussion, in a professional environment?
m52go 1 hour ago 0 replies      
Relevant: Stossel's Racist Bake Sale


chskfbsixbskffb 50 minutes ago 0 replies      
Why is trolling not civil or substantive?
sna1l 1 hour ago 2 replies      
Does anyone know first hand if any employees agreed with the sentiment in this memo? Shouldn't they be let go as well?
mankash666 1 hour ago 0 replies      
Google's code of conduct might be illegal. Regardless of how inaccurate the employee's claims, firing him for it is possibly grounds for improper termination.
mythrwy 40 minutes ago 0 replies      
And... no one is surprised.

You can be "right" all you want, but then there is politics. It's been going on since at least the days of Socrates.

Every big org does plenty of silly stuff from perspective of rank and file. Sometimes it truly is silly, sometimes there are reasons that aren't obvious at that level.

I suppose Google made the rational decision. We'll have to see if it produces a martyr which is often the longer term outcome of these kind of situations. Doesn't seem like being a rallying point in the cultural war is a situation Google would want to be in at the moment, but who knows.

Thinking it may have been wiser to state publicly that every one is entitled to their opinion, but the official policy of the company was XYZ not open for debate. And then quietly send legal with a bag of cash and a stack of "zip your lip" forms and after the hoopla dies down a bit, an uneventful departure to "pursue other opportunities".

kevingadd 1 hour ago 0 replies      
Intentionally violating your employment agreement and expecting not to be disciplined for it is definitely a strange way to demonstrate your innate superiority over the other gender. Either this guy genuinely believed he was so right that he wouldn't get punished for this, or he wanted to quit and decided to try to martyr himself.

I'm personally not a fan of the memo's content but even if I were, it demonstrates incredibly poor judgement. Posting this sort of thing on internal work message boards and signing it with your name is just plain silly. If you want to have your 'rational discussion' about gender and race in tech and not get penalized for violating an agreement you signed governing your conduct, then do it on your own time and using your own equipment instead of your employer's equipment. This isn't hard. How does someone manage to get a PhD without acquiring a basic understanding of cause and effect?

falcolas 2 hours ago 2 replies      
We fire people for snickering at the word "Dongle".

We witch hunt people who play at Gor in their spare time.

We hound CEOs who disagree with the popular stance on gay marriage.

We fire people who clumsily try to open a dialogue internally to a company.

The tech industry is indeed fucked, but perhaps not for the reasons that are currently popular.

daodedickinson 1 hour ago 0 replies      
It seems he's been fired for being wrong even as Google employees push all sorts of other harmful, inaccurate stereotypes publicly on social media. The hubris and oppression we feel underneath the unaccountable and inescapable reign of Google and Facebook grows more terrifying with every new revelation.
redthrowaway 1 hour ago 1 reply      
So, stating true, scientifically verified facts that challenge Google's official political ideology is now a fireable offence.

This is insane. The speed at which this ideology has taken over SV is breathtaking, and a massive disincentive for anyone not already so inclined to work there.

throw2016 2 hours ago 0 replies      
There is no reason for an employee to be concerned about who their employer hires as long as said companies are operating within the framework of law. This is none of your business literally.

Google is currently under investigation for pay disparity and are obliged to review everything and ensure there is no sexism at play. If you don't like this surely you should challenge the laws that demand this of Google.

Sending a memo on diversity or any political issue is an aggressive political action in the workplace that can only have one outcome. The context for debates and action about merit and diversity is in the legal democratic space not the office.

This individual is not a victim of free speech but of indiscretion.

LiteskinKanye 2 hours ago 1 reply      
FlashGit 2 hours ago 2 replies      
mcappleton 2 hours ago 2 replies      
muglug 2 hours ago 1 reply      
Oh, so stating that women can't handle stress is bad now? Political correctness gone, uh, sane.

Edit: I had hoped the sarcasm was obvious

senectus1 1 hour ago 0 replies      
Ironic really...

"Suggestions: I hope it's clear that I'm not saying that diversity is bad, that Google or society is 100% fair, that we shouldn't try to correct for existing biases, or that minorities have the same experience of those in the majority. My larger point is that we have an intolerance for ideas and evidence that don't fit a certain ideology. I'm also not saying that we should restrict people to certain gender roles; I'm advocating for quite the opposite: treat people as individuals, not as just another member of their group (tribalism)."

Google responds to the article by being knee jerk intolerant to the ideology...

Guess he was kinda right?

okabat 2 hours ago 0 replies      
Slate star codex on the science behind gender representation: https://slatestarcodex.com/2017/08/07/contra-grant-on-exagge...

Makes the very clever comparison to medicine and other fields that used to be 100% male.

"Nobody has any real policy disagreements. Everyone can just agree that men and women are equal, that they both have the same rights, that nobody should face harassment or discrimination. We can relax the Permanent State Of Emergency around too few women in tech, and admit that women have the right to go into whatever field they want, and that if they want to go off and be 80% of veterinarians and 74% of forensic scientists, those careers seem good too. We can appreciate the contributions of existing women in tech, make sure the door is open for any new ones who want to join, and start treating each other as human beings again."

CalChris 1 hour ago 0 replies      
I think I've figured out what Damore needs: a daughter. Something tells me that he'd do a 180 in 2 seconds if he had a daughter.
friedman23 2 hours ago 7 replies      
I haven't read the memo but I have read some summaries of it. Even before the memo was released I had a negative opinion of it. Not because someone spoke out their mind but because someone decided to just go and shout out their obviously controversial beliefs to the world expecting for his ideas to be taken seriously and for there to be no negative consequences.

If you want people to care about what you think and have to say, you need to spend time and effort cultivating a positive reputation. Otherwise you just seem like an unhinged idiot.

edit: because people seem to be misunderstanding my point or simply downvoting me because I am describing a reality which makes them uncomfortable I will simply reference Douglas Crockford and the Nodevember debacle.

If what occurred to Crockford occurred to some no name software engineer do you believe there would have been the outpouring of support and articles defending his character? No the person would have likely been fired for no good reason and had their name splashed across a bunch of tech blogs as a sexist.

Tinychain: a pocket-sized implementation of Bitcoin github.com
166 points by jobeirne  9 hours ago   30 comments top 10
brandonhsiao 59 minutes ago 0 replies      
There should be one of those awesome-* Github repos with a list of small, readable implementations of various protocols/technologies.
billconan 6 hours ago 0 replies      
I have a similar project in 1,000 lines or so of C++, called bingot


It was inspired by another simple crypto currency called basiccoin


At the time, basiccoin was only 600 lines of Python.

erikpukinskis 6 hours ago 2 replies      
I would love to see this for Ethereum. I was able to understand the Bitcoin protocol fairly quickly with a little reading, but I haven't come across much good writing on the mechanics of the Ethereum protocol. All of the intro texts I've seen are about like "Step 1: install the client" kind of stuff.

I'm not interested so much in how to write smart contracts, so much as how the miners work, how conflicts are resolved, and how the incentive schemes play out.

Would love to get some reading suggestions!

PaulBGD_ 8 hours ago 6 replies      
I'm working on my own blockchain (not specifically bitcoin) implementation just to wrap my head around everything. One thing I'm not getting that also wasn't answered by the source code is how you check timestamps.

I understand the whole network time = median offset + local time thing, however I'm a bit fuzzy on how you check timestamps on previous blocks when you're initially downloading the chain. How do you know that you need to check the timestamp if you can't know if you're on the latest block?
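For what it's worth, Bitcoin's consensus rule sidesteps the chicken-and-egg problem here: a historical block's timestamp is checked against the median of its own eleven predecessors, not against wall-clock time, so validation during initial download needs no notion of "the latest block". A sketch of that check (the constants are my recollection of the rules, so treat them as assumptions):

```python
import statistics

# Bitcoin tolerates block times up to ~2 hours ahead of network time.
MAX_FUTURE_DRIFT = 2 * 60 * 60

def median_time_past(prev_timestamps):
    # Median of the previous 11 block timestamps.
    return statistics.median(prev_timestamps[-11:])

def timestamp_valid(block_ts, prev_timestamps, network_time):
    # The past-facing check uses only the chain itself; the future-facing
    # check (against adjusted network time) only bites at the tip, since
    # historical blocks are always far in the past.
    return median_time_past(prev_timestamps) < block_ts <= network_time + MAX_FUTURE_DRIFT
```

During initial block download the future-drift bound passes trivially for old blocks, so each block can be validated as soon as its eleven predecessors are known.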

h4l0 7 hours ago 0 replies      
I'm also working on my own cryptocurrency implementation, forked from Zack Hess's basiccoin. Simply put, I'm trying to make the code more readable, fix bugs and provide a better API. Currently I do not have a fine README that explains my intentions, but going through the whole code and rewriting most parts made me realize how simple a blockchain actually is. Thinking about fine details like how synchronization of the blockchain should work is really inspiring. If you want to take a look at the code it is on https://github.com/halilozercan/halocoin
bhalp1 7 hours ago 0 replies      
gaetanrickter 7 hours ago 0 replies      
I can see this useful for all cryptocurrencies as well as alleviating some of the need for hedge funds investing in cryptocurrencies for their clients as brought out here ...Hedge Funds Investing in Cryptocurrencies Exploding 62 in Pipeline https://news.bitcoin.com/hedge-funds-investing-in-cryptocurr...
Hortinstein 8 hours ago 1 reply      
nice work, this is really cool. Source is very readable, great resource for those looking to understand bitcoin. Can't wait to play around with it and deploy it on my little raspberry pi swarm!
wiremine 8 hours ago 3 replies      
Currently reading through a book on bitcoin, so this is extremely timely!

Got me thinking, what are some solid bitcoin/cryptocurrency resources that the HN community would recommend?

throwaway413 8 hours ago 1 reply      
Props for the killer README!
Knolling wikipedia.org
163 points by bestest  10 hours ago   57 comments top 18
state 7 hours ago 1 reply      
"Always. Be. Knolling." [0], is probably my favorite of Tom Sachs' Ten Bullets. As a whole, it's a pretty good office manual. Better for a studio than a startup office but worth a look nonetheless.

0 - https://youtu.be/49p1JVLHUos?t=15m38s

BoppreH 8 hours ago 1 reply      
There's a great subreddit dedicated to this, with lots of content: https://www.reddit.com/r/knolling/

I find it especially interesting when it's done to "dissect" machines, making them look like schematics. I find it helps dispel the impression of things being black boxes.

__jal 8 hours ago 1 reply      
My grandfather carefully painted black silhouettes of tools that hung behind his workbench, so he could tell at a glance if anything was missing. (There was a house rule of use anything you want, but it better go back where it came from when you were done.)

He didn't do that with his toolboxes, but all but one of those were locked.

I do find I do this as a sort of a tick, while thinking through my next step while making stuff.

smnscu 8 hours ago 0 replies      
Also see Andrew Kim's book 90 on knolling. Andrew is the designer behind minimallyminimal.com, the Microsoft redesign concept, Xbox One S, who now works at Tesla.



kosma 8 hours ago 0 replies      
OT: This has been parodied at least once; see Stackenblochen[1].


falcolas 8 hours ago 0 replies      
Adam Savage is a compulsive knoller - if you watch his podcasts you can frequently see him arranging items in front of him into parallel and perpendicular shapes.
sboselli 9 hours ago 3 replies      
The Wikipedia article doesn't mention it, but just as Tom Sachs spent time at Gehry's shop and adopted this term and practice, so did Casey Neistat spend time at Sachs' shop, where he also adopted it (and probably made it better known than ever before). If you've seen Casey's videos or pictures of his famous NY studio, this is where it all came from.
DanBlake 8 hours ago 3 replies      
I built an entire website essentially around this - http://www.everydaycarry.com - it's been crazy seeing something that started off as a niche hobby turning into a business
hkon 7 hours ago 0 replies      
If you have knolled everything you own but still want more, check out this: http://stupiddesk.com/
ronjouch 4 hours ago 0 replies      
Didn't know this practice has a name! Thanks HN :)

Sharing the excellent bandcamp of two musicians whose profile picture is a knolling of themselves and their gear. Bandcamp: https://qdrpd.bandcamp.com/ , knolling: https://f4.bcbits.com/img/0000904983_10.jpg

spraak 3 hours ago 0 replies      
I never knew it was Knolling that is in almost every stock photo of a desk for a startup's landing page
Froyoh 51 minutes ago 0 replies      
I wish I found about this sooner...
cJ0th 5 hours ago 0 replies      
I can't be the only one who expected this to be about a certain Google project that aimed at organizing knowledge ;)
ttoinou 8 hours ago 0 replies      
.. But, why ?
keyle 5 hours ago 1 reply      
aside/fun: What would be knolling code be like? :)
duck 3 hours ago 0 replies      
What do you call it if you've always aligned things like this?
geff82 7 hours ago 0 replies      
You never stop learning reading HN :)
moomin 8 hours ago 0 replies      
Here I was expecting an article on Google's Wikipedia competitor. But I guess that isn't notable...
Not even wrong: Why does nobody like pilot-wave theory? [pdf] cam.ac.uk
42 points by fanf2  5 hours ago   36 comments top 5
AgentME 1 hour ago 1 reply      
My understanding of QM is only surface-level, so there may be subtleties I'm missing, but this presentation's arguments against the many-worlds interpretation seem pretty weak.

Page 28 grudgingly alludes to the idea that MWI could be simpler than other theories ("Objection based on surprisingly common misconception that standard QM defined solely by Schrödinger's equation ... It is only within a many-worlds framework that this view could begin to make sense"), but later introduces MWI as a contender beginning with a nonsense news headline and then characterizes it as intuitively bizarre for proposing a large multiverse, as if the size of its proposed multiverse were an obvious point against it. It seems to me there's an implied misuse of Occam's razor: "Entities are not to be multiplied without necessity" is most properly applied to the number of rules in a theory, not the amount of matter a theory predicts. (If you apply it based on the amount of matter a theory predicts, then Occam's razor surely should rule out theories that dim clusters of light in the sky correspond to trillions of galaxies like our own, and should prefer theories that predict that they're illusions, reflections, or something otherwise insignificant that fits the observations.)

Pages 39 and 40 both quote paragraphs from books that seem more convincing to me than the presentation's seemingly hand-waving refutations. Page 40's refutation appears to do nothing but insist upon PWT's "epiphenomenal 'pointer'", naming our branch as real and explaining away the many deep interactions of other branches as only being "mathematically significant" rather than having "ontological significance". ... Maybe this is a bad time to mention my preference for the Mathematical universe hypothesis ("which posits that all computable mathematical structures (in Gödel's sense) exist"), which seems to make the idea of something having mathematical significance but not ontological significance meaningless.

Koshkin 3 hours ago 3 replies      
I am sure that laymen like it - because it seems to offer an "intuitive" picture of the wave-particle dualism.

Problem is, neither QFT nor modern deeper theories such as strings gain anything from this kind of metaphysical image.

EGreg 5 minutes ago 0 replies      
I love PWT. I think it is the right explanation for what we observe.

Other theories don't rule out nonlocality anyway. And I'd rather have nonlocality than their non realism!!

olegkikin 2 hours ago 5 replies      
I have an unrelated question to the physics nerds. Is it even possible to determine whether the universe is deterministic or not from inside the universe?

Let's imagine a simplified example. I give you two sets of numbers (let's pretend they are atom coordinates). One set is purely random, another one is generated with a very simple formula (example: f(n) = SHA256(N + salt)). I will give you as many of the numbers as you want. Can you determine which set is which (if you don't know the formula ahead of time)?
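The parent's two sets map onto a standard cryptographic notion: SHA-256 is designed to behave as a pseudorandom function, so without the salt no efficient test is expected to tell them apart. A quick sanity check with a simple monobit frequency count, which both streams pass equally well:

```python
import hashlib
import secrets

def prf_stream(n, salt):
    # Deterministic "coordinates": SHA-256 of a counter plus a secret salt.
    return [hashlib.sha256(f"{i}{salt}".encode()).digest() for i in range(n)]

def random_stream(n):
    # Genuinely random 32-byte values from the OS entropy pool.
    return [secrets.token_bytes(32) for _ in range(n)]

def ones_fraction(stream):
    # Fraction of 1-bits; anything that looks random hovers near 0.5.
    ones = sum(bin(byte).count("1") for blob in stream for byte in blob)
    total = sum(len(blob) * 8 for blob in stream)
    return ones / total
```

More elaborate statistical batteries fare no better: distinguishing the two sets without the salt would amount to breaking SHA-256. With the salt, of course, the deterministic set is trivially identifiable, which is the asymmetry the determinism question turns on.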

guscost 3 hours ago 1 reply      
TL;DR: Postmodernists don't like it because it abandons locality in favor of determinism. Modernists like it for the exact same reason.
Eager to Burst His Own Bubble, a Techie Made Apps to Randomize His Life npr.org
316 points by ZeljkoS  15 hours ago   129 comments top 28
et-al 8 hours ago 7 replies      
To a certain extent, this is how we used to travel back in the day.

You hung out in a hostel, had conversations with other travelers (instead of thumbing through Instagram), and let the randomness of other people and life, not apps, dictate your itinerary. You walk down a street, "oh hey that looks interesting", and wander down a quiet alley that leads to a cute cafe, or jump in the back of a tuk-tuk headed to a waterfall that may or may not really exist, but who cares? You're riding the wave. One of the main reasons for travel/holidays is to break from routine, and the single most significant thing one can do, bear with me Silicon Valley, is to put away that smartphone. Try exercising your intuition instead of apps.

Many folks nowadays have optimised their lives so much that they've needed to create a noise-generator to bring back some humanity.

reustle 11 hours ago 4 replies      
I actually met Max while we both happened to be in Thailand in 2015. We spent the day on bicycles following the directions his script told him, without hesitation. Regardless of where it was in the city, that's where we went next (a laundromat, a daycare, cafes, and the zoo, iirc). Nothing was skipped, because it was what the software told us to do.

Here are some pictures from that day https://goo.gl/photos/gyCNRz2rs7zLJrt79

maxhawkins 10 hours ago 1 reply      
If you want to try this out check out my Facebook group, The Third Party:


You can send us a message to receive a randomly selected event near you. People from all over are attending the events and posting about their experiences in the group.

simmons 11 hours ago 3 replies      
This is great. I think that the illusion of time speeding up as we get older is due to getting into a routine where we do the same thing every day, and the brain begins compressing memories as our everyday experiences become less novel. Using a "randomizer lite" program to shake things up might be a good start to breaking the routine.
owenversteeg 11 hours ago 2 replies      
Ah, but what was his source of randomness? Perhaps he didn't have enough entropy and now he's got to do the last few years all over again ;)

I think I naturally do a bit of that myself: whenever I have some empty time, I fill it with something "random", only instead of choosing randomly I often choose the cheapest option. For example, I once booked a flight to Iran departing hours after I bought the ticket, simply because it was the cheapest option for an interesting place to fly (under $180 round-trip.)

I think the design choices really impact the end result, though. One minor design flaw might result in completely eliminating a whole lot of interesting places or things, which is what I'd be scared of. For example, that cheap fare to Iran was only on one travel search site, which didn't have an API. Had the algorithm been limited to one booking website's API and left to decide for me, I wouldn't have gone to Iran.

Similarly, there are a lot of things that wouldn't seem like an "option" to a computer but are an option to me. I've wanted to see Greenland from the air, so I've been taking a lot of flights that pass over Greenland on the way to places I needed to go anyway. But if the algorithm decided for me, it would probably have booked air tours over Greenland - substantially more expensive in terms of both time and money. It wouldn't be able to say "hey, you know that trip across the Atlantic you have in a few weeks? Why not pay $25 more to have it fly over Greenland?"

dogruck 11 hours ago 5 replies      
Max should meet Luke Rhinehart -- The Dice Man: https://en.wikipedia.org/wiki/The_Dice_Man
Mz 5 hours ago 2 replies      
This "I'm bored with my life. I shall appify it and travel to random countries" stuff is a whole level beyond First World Problems. It is something like First World Problems of the Jet Set. Yet, this article doesn't cast him as a member of the Jet Set.

Modern Life has gone to some rather weird places that were simply inconceivable until incredibly recently.

throwaway2016a 12 hours ago 4 replies      
This is an awesome idea.

I've spent a lot of energy making apps to automate my life and management my schedule. Now I'm tempted to also have it throw in something random now and again. I couldn't go to these extremes (I couldn't pick up and move to another country for instance) but it would be cool to say do things less extreme... like pick a random place for dinner or watch this random Netflix movie.
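The "less extreme" version fits in a few lines. A toy sketch (the option lists are invented; a real version would pull candidates from a restaurant or catalog API):

```python
import random

# Hypothetical option pools -- fill these with whatever you'd like
# the randomizer to choose from.
OPTIONS = {
    "dinner": ["thai place", "taqueria", "ramen shop", "that new diner"],
    "movie": ["a documentary", "a foreign film", "80s action", "an indie drama"],
}

def surprise_me(category, rng=random.SystemRandom()):
    # SystemRandom draws from the OS entropy pool, so you can't
    # predict (or second-guess) the outcome from a seed.
    return rng.choice(OPTIONS[category])
```

Calling `surprise_me("dinner")` returns one entry at random; the point is to remove your own veto, not to be sophisticated.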

ZeljkoS 15 hours ago 0 replies      
bartread 7 hours ago 1 reply      
My immediate reaction to this was, "hey, this is awesome," and actually it really is. I think it's an interesting and creative solution to a problem a lot of us share - admittedly whilst recognising that for large swathes of the world's population this would be a great problem to have.

Still, there's that nagging little voice in the back of my head: part of me can't help wondering what will happen, and how people will come to view it, if/when his apps catch on (as I suspect they will).

gehwartzen 2 hours ago 0 replies      
The past equivalent was flipping to a random page in the yellow pages, throwing a dart at a map, or flipping a coin. It's funny that we need an app to randomize our life at all. I suppose it is largely just how we access information now.
narrator 10 hours ago 0 replies      
This is the plot of the indie movie called "Buggaboo": http://www.imdb.com/title/tt0206610/

The main character, who is an Indian engineer living in Silicon Valley, says to his friends : "What if you could randomize your life?". That's all I'll say. It's a good movie. I'm not going to spoil it for you guys.

avip 8 hours ago 0 replies      
Inspiration from Borges maybe?

https://en.wikipedia.org/wiki/The_Lottery_in_Babylon

(It's in the highly entertaining Ficciones collection.)

driverdan 11 hours ago 1 reply      
I love the idea of random surprises.

I created a simple app that picks random items from a Chinese ecommerce site within a set budget. My long term goal was to use ML to learn what each user liked and send them random items on a schedule, selecting from multiple sites.

I never finished it because other things took precedence but the random selection part works.

Anyone interested in this as a service?

rnprince 5 hours ago 0 replies      
Reminds me of undergrad class registration, when everything satisfying a general education requirement would fill up and the server would crash the instant it started.

I didn't choose to take History and Religion of Ancient Israel, but it was the class I got the most out of.

jpatokal 5 hours ago 0 replies      
The Degree Confluence Project is an interesting version of random travel: people visiting arbitrary lat/long intersections.


lolc 11 hours ago 1 reply      
Well he won't meet me. I live in the No-Facebook Bubble.

Though sometimes the parties I help organise are listed on the Facebook by third parties. So there's that slim chance...

dubin 9 hours ago 1 reply      
Anyone who's interested in more ideas along these lines should look at Tyler Cowen's list of how to be less complacent: http://tylercowen.com/complacent-class-quiz/
Tade0 10 hours ago 1 reply      
Great idea, but I don't need an app for that. My fiancée takes on this role - we've been living in another country for almost two years because of one idea she had. I'm picking the next place once both of us get bored with this one. Suggestions, as always, welcome. Has to be in the EU though.
628C6l0 10 hours ago 2 replies      
i've been doing it for almost four years now, and it's amazing to see how wrong and how frequently wrong your preconceptions and expectations can be.

you don't need an app to do that though. a spreadsheet is perfect for that purpose (and telling Google Assistant to 'flip a coin' or 'give me a random number between 1 and 20') and gives unlimited flexibility and less reliance on developer to implement features you want. not a lot of people appreciate this, but sometimes trying to accomplish a task with the general-purpose tools you have at hand can lead you to discover solutions so good it is in fact superior to any dedicated tool.

barsonme 2 hours ago 0 replies      
some people use apps, others of us have ADHD that works just as well :-)
clamprecht 12 hours ago 1 reply      
See also, Geohashing: https://en.wikipedia.org/wiki/Geohashing

"Geohashing is an outdoor recreational activity inspired by the webcomic xkcd, in which participants have to reach a random location (chosen by a computer algorithm), prove their achievement by taking a picture of a Global Positioning System (GPS) receiver or another mobile device and then tell the story of their trip online."
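The algorithm behind geohashing is tiny: hash the date together with that day's Dow Jones opening price, then use the two halves of the digest as fractional coordinates inside your 1-degree graticule. A sketch based on my reading of the xkcd algorithm (verify against the official spec before trusting it for an actual expedition):

```python
import hashlib

def geohash(graticule_lat, graticule_lon, date, djia_open):
    # e.g. geohash(37, -122, "2005-05-26", "10458.68")
    digest = hashlib.md5(f"{date}-{djia_open}".encode()).hexdigest()
    # First 16 hex digits -> latitude fraction, last 16 -> longitude fraction.
    lat_frac = int(digest[:16], 16) / 16**16
    lon_frac = int(digest[16:], 16) / 16**16
    # Push the fraction away from zero so it stays inside the graticule.
    sign = lambda g: -1 if g < 0 else 1
    return (graticule_lat + sign(graticule_lat) * lat_frac,
            graticule_lon + sign(graticule_lon) * lon_frac)
```

Because everyone in a graticule uses the same public inputs, the destination is unpredictable in advance yet identical for all participants that day.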

thebigspacefuck 10 hours ago 0 replies      
"Peculiar travel suggestions are dancing lessons from God"
ghosttie 9 hours ago 0 replies      
So basically diceliving
allard 5 hours ago 0 replies      
like John Cage
stillhere 11 hours ago 5 replies      
user432468 7 hours ago 3 replies      
NPR so diligently points out white privilege. It couldn't be that he dresses well, has good hygiene, is educated, well off, and speaks politely. It is only because he's white? In San Francisco of all places?

> At first, he was nervous: What if people wouldn't let him in? But, as a kind of unassuming white guy, he actually didn't have this problem. (And Max acknowledges this privilege.)

Dijkstra was right: recursion should not be difficult angularindepth.com
151 points by gregorymichael  10 hours ago   124 comments top 34
jerf 9 hours ago 5 replies      
"This task can also be solved using an iterative solution, but it's way more complicated because we would have to nest for loops but don't know how deep they are nested. The unknown number of nested loops is a common characteristic of all problems that are recursive in their nature and should give you a hint that a recursive solution is required."

Would suggest instead something like

"We could solve this directly by managing our own stack of work done and keeping track of where we are in the problems as we divide them up, but it's easier to take advantage of the fact that the programming language is already providing us a call stack that can do this work for us."

Then, later, you can point out that this also helps with the backtracking solution as the programming language is maintaining the old environments; if you were iteratively managing your own stack you'd also have to backtrack manually.

Because that's how you transform a recursive call into an iterative one, not nested for loops. The general points made in the article still hold up, of course.
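For concreteness, a sketch of that transformation (assuming the article's nested-array sum; the explicit `stack` variable plays the role the call stack plays in the recursive version):

```javascript
// Iterative nested-array sum: instead of relying on the language's
// call stack, we manage our own stack of pending items. No nested
// for loops are needed, however deep the nesting goes.
function sumIterative(a) {
  const stack = [a];   // values still to be processed
  let total = 0;
  while (stack.length > 0) {
    const item = stack.pop();
    if (Array.isArray(item)) {
      // "Divide": push the sub-arrays instead of recursing into them.
      for (const x of item) stack.push(x);
    } else {
      total += item;
    }
  }
  return total;
}
```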

aaron-santos 9 hours ago 5 replies      
Recursion shouldn't be difficult, but we write recursive programs using one tool: self-referential functions. That's like trying to write structured programs using one tool - goto.

Functional Programming with Bananas, Lenses, Envelopes and Barbed Wire[1] pioneered generalized recursion schemes and influenced work in Haskell[2] and Scala[3]. It's great having the vocabulary to describe recursion and the tools to elegantly recurse, but anamorphism tends to shut down conversations faster than mentioning the m-word.

[1] - https://maartenfokkinga.github.io/utwente/mmf91m.pdf

[2] - https://hackage.haskell.org/package/recursion-schemes

[3] - https://github.com/slamdata/matryoshka

igorbark 9 hours ago 7 replies      
This article touches on some good points. In particular, the part about trusting that the function will do the right thing is something that people who already understand recursion take for granted but is extremely crucial. However:

"Let's tweak our sum function a bit: ... That's it."

I really dislike articles that earnestly try to convince the audience that some subject which many find extremely difficult to understand is actually very simple. My impression is that the intent is to help people who lack the confidence to engage with the topic, but in my opinion the exact opposite effect is achieved. At that point in the article, nothing about recursion had been explained and it's not reasonable to expect someone who previously did not understand recursion to read that section and come to the conclusion "that's it!" Instead they'll be sitting there, extremely confused by something which they're being reassured is actually quite simple. This is not going to help their confidence.

So while I appreciate the author's intent, I think that someone who is struggling to understand recursion might leave this article worse off than when they started. And I wish this trope would disappear from tutorials.

ballenf 6 hours ago 3 replies      
You don't really understand how difficult recursion is until you've tried to teach a bunch of programming newcomers the concept. A few get it quickly, but for the rest all the "think of it like ..." suggestions don't really help. It's a concept that everyone seems to have to learn through practice (as the author implies). I fell into the trap that if I only explained it clearly enough I could make it easy for others. Everyone learns it their own way and you just have to give students enough opportunities to practice for them to get it. I don't disagree with Dijkstra that the concept shouldn't be difficult, but I do not believe there are any easy paths to the concept for most.

It's one of the most interesting learning processes in that once you get it, it feels almost as simple as a for loop or simple math. But until you get it it feels like the most opaque, impenetrable wall where 1+1 != 2.

Side note, recursion would be slightly easier if the concept of a base case were applied to all functions as preconditions. The idea of a base case really has more applicability than just in recursion, it's just that we don't call it a base case in other situations.

Recursion is also interesting to implement when doing TDD. It comes in a refactoring step, of course, but you don't have to think about the base case(s) because they've already been done. Again, not suggesting this is helpful to a newcomer struggling, but is just an interesting process.

jancsika 9 hours ago 1 reply      
> This is because a recursive sub-function never solves the exact same problem as the original function, but always a simpler version of an original problem.

This isn't always true. There are plenty of recursive functions whose purpose is merely to climb until a condition is met, namely that there are no more rungs in the ladder.

Recursion is difficult because it isn't just taught, it is mythologized as the potentially infinite source of one's love for programming. If teachers similarly mythologized the "switch" statement I'm sure they could confuse the topic of conditionals just as easily.

Edmond 9 hours ago 2 replies      
Practice makes perfect, I think most learners can get comfortable with recursion as they use it to implement simple recursive solutions (Fibonacci, tree traversals..etc). Over time it becomes quite natural to recognize when and how to use recursion.

One source of confusion that most learners have stems from the seemingly mysterious way that intermediate results are kept track of. This is one of the downsides of not having experience with low-level programming. If you're familiar with assembly language programming and understand how register contents are pushed/popped on/off the stack between subroutine calls then it is easier to understand what is happening behind the scenes.

aidenn0 5 hours ago 4 replies      
Anything that is self-referential is hard. I remember when we covered proof-by-induction in HS math. There was a significant fraction of the class that did not believe a given proof was sufficient.

If you can't assure yourself of the correctness of a fairly simple recursive function that is given to you, then composing one yourself is going to be nigh impossible.


From the article: "Instead trust that it returns the correct sum for all elements of an array a[i]"

This is exactly the point where many people get hung up. They cannot proceed to reading the next line until they understand how the recursive call works. They are unwilling to trust that the function works as advertised, and cannot shift gears to continue past that. In my experience, telling them to "just trust it" and such seems to not help much.

analog31 6 hours ago 0 replies      
When I was a kid, we learned the principle of mathematical induction in algebra class. That made recursion seem easy.

As for the limitations of recursion on any given system, that's easy to learn because... kids will be kids. Practically the first thing I tried with recursion ended up with the system operator yelling at me, and with me learning what a core dump looked like.

hyperpape 8 hours ago 1 reply      
I admit this is complete speculation on my part, but I can't imagine it helps to combine recursion with C-style for-loops.

Both pieces of advice from the article, "trust the recursive function" and "don't try and trace the execution", seem easier to follow if the control flow is simpler.


  function sum(a) {
    let result = 0;
    for (let i = 0; i < a.length; i++) {
      if (Array.isArray(a[i])) {
        result += sum(a[i]);
      } else {
        result += a[i];
      }
    }
    return result;
  }

  function findCeo(employee) {
    let boss = employee.getBoss();
    if (boss == null) {
      return employee;
    } else {
      return findCeo(boss);
    }
  }
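For comparison, the loop-free shape the parent is hinting at might look like this (my rewrite, not from the article): recurse on the first element and on the rest of the array.

```javascript
// Nested-array sum with no loop at all: the base case is the empty
// array, and each call handles one element plus the rest.
function sumRec(a) {
  if (a.length === 0) return 0;                       // base case
  const [head, ...rest] = a;
  const headSum = Array.isArray(head) ? sumRec(head) : head;
  return headSum + sumRec(rest);
}
```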

charlieflowers 35 minutes ago 0 replies      
The Little Schemer is what made me fully comfortable with recursion. It's an amazing teacher.
falcolas 8 hours ago 1 reply      
The two biggest problems with recursion in practical use is that not all compilers can properly handle optimizations around tail calls, and not all TCO capable compilers can handle all tail call cases.

If either case occurs, you run into the issue where you will seemingly randomly blow out your stack space when your list is too long, or your tree too deep. Then you have to fall back to a non-functional idiom (or do some other magical incantations to avoid function stacks, like Clojure's `loop` special form).

I think recursion is a great tool, but it's so poorly implemented in so many languages that it's simply too dangerous to use.
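A trampoline is one such incantation for languages without TCO; a minimal sketch (names are illustrative, not from any particular library):

```javascript
// A trampoline runs a tail-recursive computation in constant stack
// space: instead of making the tail call directly, the function
// returns a thunk, and a driver loop keeps invoking thunks until a
// plain value comes back.
function trampoline(fn, ...args) {
  let result = fn(...args);
  while (typeof result === 'function') {
    result = result();          // one bounce per "tail call"
  }
  return result;
}

// Written as a direct recursive call, this would overflow the stack
// for large n; as a thunk-returning function it runs in a loop.
function countDown(n) {
  return n === 0 ? 'done' : () => countDown(n - 1);
}
```

For example, `trampoline(countDown, 1000000)` completes without blowing the stack.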

jrs95 7 hours ago 1 reply      
I think part of this can be explained by the way recursion is handled on major platforms. Even something like Python which has some functional leanings is prone to stack overflows with deep-ish recursion without hacky workarounds. Once the tools most people are using are more recursion friendly, I think we'll see it become more mainstream. I think JS has TCO now, at least on Node, which is a huge step in that direction.
eru 5 hours ago 0 replies      
Imperative programmers learn special purpose constructs like loops that only work in simple cases. But since they are so wedded to them, they only reach for the more general tool in more complicated cases. Thus they blame the general tool for the complication.

Just do away with the special cases, and use e.g. function calls as the basis for your control flow needs. Then recursion becomes simple.

If necessary, one can introduce loops as a special case later---if working with legacy code or languages that don't properly support function calls. (Eg languages without first class functions and tail call optimizations.)

c517402 1 hour ago 0 replies      
IMHO, the first problem of summing a generalized array of numerical elements is not best solved recursively. I suggest sum(flatten( )) is better and uses less overhead. The subsequent problems under discussion more strongly require recursion.
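In JavaScript terms that approach is, for example (a sketch; `Array.prototype.flat` with depth `Infinity` handles arbitrary nesting):

```javascript
// Flatten first, then sum: the only recursion (inside flat) is hidden
// in the library, and the summing itself is a plain linear reduce.
function sumFlat(a) {
  return a.flat(Infinity).reduce((acc, x) => acc + x, 0);
}
```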
agumonkey 7 hours ago 0 replies      
Here are the steps in my recursion experience:

1) teenager reading a pascal book: clueless

2) college C course: imperative "print tree", easy

3) advanced compiler: more sophisticated environment thingies, clueless again

4) CGI quadtrees: want fully recursive solution but chokes

5) rediscovery of lisp: recursion wannabe

6) cons-based lists fundamentals: linear converging recursion is everywhere, the pattern is easily understood and memorized, everything makes sense

7) FP and lispy combinatorics (treerec, partition, countdown): it seems like point 2) but something is different now.. deep recursion involves self closing types. It's not just repeat until and do not care about what happens you'll fix it down the road. It HAS to make sense, and to typecheck right away.. recursive steps will be used for the previous level it has to fit logically.

If one reaches that level your understanding shifts entirely. Recursion is its own solution; it's absurd, but it's a deep reference point to solve problems, since you can assume something that doesn't exist and see if it holds..

Now you get to rediscover math proofs and a lot of late HS and college math. Finally, something I wanted to know and recently found out about: list recursion is only a linear pattern of recurrence. It's one dimensional and often on an easy to prove converging variable (cdr + nil?, natural dec + zero?). There are n-dimensional patterns, and literature about it. I can't recall the books but I'll dig for it.

And finally, Turing among others tried to rethink imperative programs as one big while loop modifying one big vector variable (aka system state) and to prove that this n-dimensional vector could only have some sound values and would reach terminating ones. (I think it's called linear ranking ..)


zmonx 8 hours ago 0 replies      
If you want to really delve into recursion, I recommend to learn a logic programming language like Prolog.

Interestingly, Prolog programs are often efficient and general because, due to the power of unification, they are more often tail recursive than functional programs.

For example, we can define a relation called "append" in Prolog as follows:

  append([], Ls, Ls).
  append([A|As], Ls, [A|Rest]) :- append(As, Ls, Rest).
In Prolog, recursion naturally arises by considering the inductive definitions of terms. For example, you can read the second clause as "If it is the case that append(As, Ls, Rest) holds, then it is also the case that append([A|As], Ls, [A|Rest]) holds".

Note that such cases are not naturally tail recursive in Lisp, for example, because in languages that do not support unification of logical variables, there is one additional call of "cons" after the recursive call to construct the result. Also, the Prolog version is much more general and can be used in several directions! For this reason, "append" is not even a good name for such a relation, because it already implies one particular direction.

Aardwolf 9 hours ago 1 reply      
For one it's not handled well by many languages: expensive stack cost, not tail call optimized, max limits on stack, ... It would be nice if you could write recursive algorithms in code and be sure they ran as well and as efficiently as when you do it manually with for loops and separate stack data structures.
icc97 6 hours ago 0 replies      
Recursion is hard, but it is a fundamentally great tool.

If I talk to someone who doesn't understand programming, they can usually understand the if and for statements that you might find in food recipes or in legal contracts.

But almost nowhere outside mathematics and computer science (and certainly not as common as recipes) do you come across recursion.

It's mentally hard to think of a problem so that each step gets simpler.

Solving the Towers of Hanoi is not straightforward: I can't follow it at the speed that I can read, as I can with basic math; I have to really think about whether what they are suggesting works and how it works.

However when you get to solving hard problems, recursion really helps. I know from having spent hundreds of hours with interview questions. Bugs can make a problem go from taking me 30 minutes to 4+ hours. Whereas using recursion and other functional programming concepts, once I got to the solution it would usually be bug free. So whereas it might not be as fast, I would be more confident in my final answer.

JepZ 8 hours ago 2 replies      
Some years ago I had problems with recursion too, because I always had some edge cases which made my functions go wild (endless loops). But one day someone told me that most recursive programs can be fitted into the skeleton below:

  function a() {
    if (/* Termination Condition */) {
      return /* Special Element */
    } else {
      /* Recursion: e.g. a + a */
      return a() + a()
    }
  }
Since that day I love recursions.
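Instantiated for a concrete case (my example, not the parent's): summing a binary tree of `{value, left, right}` nodes follows the skeleton directly.

```javascript
// Termination condition: an empty subtree (null) is the special
// element and contributes 0. Recursion: the "a() + a()" slot becomes
// the sum of the two subtrees.
function treeSum(node) {
  if (node === null) {
    return 0;
  } else {
    return node.value + treeSum(node.left) + treeSum(node.right);
  }
}
```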

eksemplar 7 hours ago 1 reply      
Is recursion difficult? I remember when we did our first recursion in the first year of CS and our professor told us it would be hard. Then she went on a rant about divide and conquer which made little sense and eventually got to the bit about calling yourself until a criterion is met.

I think most of us thought it was the easiest part of that year.

jacquesm 9 hours ago 1 reply      
I love the Erlang way of dealing with this:

  % Problem #2
  % Each new term in the Fibonacci sequence is generated by
  % adding the previous two terms. By starting with 1 and 2,
  % the first 10 terms will be:
  % 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ...
  % By considering the terms in the Fibonacci sequence whose
  % values do not exceed four million, find the sum of the
  % even-valued terms.
  fibsum(_N, M, L) when M >= L -> 0;
  fibsum(N, M, L) when M rem 2 =:= 0 -> M + fibsum(M, N + M, L);
  fibsum(N, M, L) when M rem 2 =:= 1 -> fibsum(M, N + M, L).

  problem_2(L) -> fibsum(1, 2, L).

  problem_2_test() -> ?assert(problem_2(100) =:= 44).

  problem_2() -> problem_2(4000000).
(When familiarizing myself with a new programming language I usually work through several 10's of the project Euler problems, that way I only have to learn one new thing.)

projektir 9 hours ago 0 replies      
I don't think telling people who find a subject difficult that it is easy is very convincing.

Recursion seems fairly straightforward to me if you think about it in terms of how it's typically implemented (the stack and jumps), but that's not quite the mathematical way of thinking about it, I suppose.

blueyes 4 hours ago 0 replies      
I think a good analogy to teach recursion is humans, or really any living thing, since they too are functions that call themselves. That is, each time you create a new human, its function is to create a new human, and so on ad infinitum.
euske 2 hours ago 0 replies      
"There's no such thing as a bad student. Only a bad teacher." - Mr. Miyagi
hermanbergwerf 3 hours ago 1 reply      
Didn't read the article. I usually don't have trouble with recursion, however lately I have started to avoid it more and more often. It is true that linear solutions usually look more complex. However in my opinion they also allow better reuse of local variables. Their complexity also depends on the data structure. Designing a linear data structure for nested problems is sometimes possible and can allow simple non-recursive solutions which I find more elegant.
mysterydip 8 hours ago 1 reply      
The concept of recursion was easiest to grasp by watching a kid do the maze on a placemat at a restaurant. You go in a line until you find a dead end, then backtrack. Repeat the process until you reach the goal. Each intersection where you make a decision is your function call, and each backtrack is the return result.
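That placemat procedure maps directly onto recursive backtracking; a sketch on a grid maze of '.' (open) and '#' (wall) cells (layout and names are mine):

```javascript
// Depth-first maze solver: each call tries one cell; a dead end
// returns false (the "backtrack"), and the caller tries the next
// direction. `visited` keeps us from walking in circles.
function solve(maze, r, c, goal, visited = new Set()) {
  if (r < 0 || r >= maze.length || c < 0 || c >= maze[0].length) return false;
  const key = r + ',' + c;
  if (maze[r][c] === '#' || visited.has(key)) return false;
  if (r === goal[0] && c === goal[1]) return true;   // reached the goal
  visited.add(key);
  return solve(maze, r + 1, c, goal, visited) ||
         solve(maze, r - 1, c, goal, visited) ||
         solve(maze, r, c + 1, goal, visited) ||
         solve(maze, r, c - 1, goal, visited);
}
```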
naoe8nstaoeusnt 4 hours ago 0 replies      
I liked this. It reminds me of proof by induction. I recall my teacher telling us it was one of the harder concepts, and thinking, "this sounds just like recursion, and seems pretty natural." I think if she had taken away the scary introductory language, students would have been less afraid of it.
merb 7 hours ago 0 replies      
well I do not think recursion is difficult. but I think doing it in a tail-call optimised way can sometimes be tricky (especially in compilers or node traversals). That's why CPS-style recursion comes into play more and more often. heck even scala has a way to do cps style tail calls: https://www.scala-lang.org/api/current/scala/util/control/Ta...
bsder 8 hours ago 1 reply      
Recursion is different. Almost in exactly the same way that geometric proofs are different.

As such, it requires different ways of teaching.

I have used "The Little Schemer" (even back when it was called "The Little Lisper") to teach lots of "non-programmers" (I hate that description) how to do recursion.

ccvannorman 7 hours ago 0 replies      
I found this HN thread to be quite useful to this conversation: https://news.ycombinator.com/item?id=14949392
wyager 9 hours ago 0 replies      
The first problem in the OP is ill-typed, so if this was intended to help people get a good grasp on recursion, that may be counterproductive.

I never had any problem using recursion, but my ability to reason about it at an abstract level is due in large part to the excellent paper "Functional Programming with Bananas, Lenses, Envelopes and Barbed Wire", which covers various abstract approaches to dealing with recursion schemes. https://maartenfokkinga.github.io/utwente/mmf91m.pdf

voidhorse 8 hours ago 0 replies      
The easiest way to learn recursion imo is by limiting the domain.

For example, we have a thing called a number:

x

How do we indicate the next number? Well we need a new symbol to distinguish one number from another.

Lets use a marker called prime:

x, x`

Great, now we have two numbers. But let's get a third number. How could we indicate that? Well we could introduce a totally new symbol, like a 'q'.

x, x`, q.

Well that works. but we seem to have just arbitrarily chosen q. There's no immediate obvious aspect of q that lets us know what the next symbol should be if someone asked us to add yet another number. We have three distinct number things but no clear unified way to proceed... But wait, before we took some number x and added a prime to it: x -> x`, which got us a separate symbol, aka a different number. What if we just do that again but this time with our second number: x` -> x``. Great. Now we have three numbers, and we can always use the same rule "add a prime mark" to generate the next symbol, so long as we input the last symbol into our rule:

x, x`, x``

We also wind up with neat little equivalences:

x`` = x`, x = x, x, x

or if our prime operation is a function:

x`` = prime(prime(x)) = prime(x`) = x``

so x``, is just the result of prime of prime of x.

So on and so forth until infinity.

What makes recursion harder to work with in CS is that, unlike summation, you can't go to infinity--you need to terminate the operation eventually if you actually want it to do anything. Because of this, and because our inputs are not little units like numbers but are usually data structures which require more rules to recurse over, recursion in CS is tightly and innately coupled with pattern matching and the notion of defining bottoms or terminating cases. This is not the case for operations in mathematics, where we do not care about terminating because the expression just encodes the rule and isn't actually meant to do anything; when we do need to do something with it we limit it by deciding when to stop (so I guess some neuron firing is the bottom/terminating case for human operations?).

Recursion in CS is also additionally difficult because we have to worry about how the computer stores the results of each recursive application and feeds them back whereas, kind of like a high level language, our brain just manages all that for us when we do math. Also note that we take huge shortcuts with things like Arabic numerals--which are an incredible advance in efficiency but probably obfuscate the recursive roots of mathematics for most learners (e.g. you just learn that some symbol 2 and some symbol 2 is another symbol 4 and not that these are encapsulating successive applications of the same rule over a set of arbitrary symbols--I mean, your brain knows this deep down--that's why it works, but it makes it difficult to immediately comprehend that it is in fact recursive unless people have pointed this out to you--at least that was my experience; I may have just had bad teachers).

such_a_casual 3 hours ago 0 replies      
"This article is aimed to introduce and explain a number of topics and ideas that will help you when practicing recursive problems."

I know I'm not supposed to shit talk the article. I know someone spent a lot of time making it for me and others. However, we've gotta move forward as a community. We have to raise the bar. We have to stop this nonsense.

There is an issue here greater than recursion. I know that's bold of me to say, but I'm serious. Programmers need to STOP fucking telling people that they can explain something that requires practice to understand. You cannot learn multiplication from reading a blog post. It has to be drilled into you over and over again. And only after it's been drilled into you what numbers are, what addition and subtraction are, and what carrying and borrowing mean. You learn this over the course of YEARS not 5 minutes reading some shitpost on the internet. Recursion is just like math. It has to be drilled into you. I had zero understanding of this for YEARS. And then I picked up a book that drills recursion into you. Only after writing function after function (functions essentially identical to one another in structure), was I able to understand the IMPORTANCE of recursion. Not by thinking about it, or reasoning about it, by actually doing it, over and over again. Recursion can be explained in 10 seconds. I'll do it right now. Recursion is when a set of instructions starts over from step 1. Or for us programmers, a function which calls itself. Bam. all you need to know about recursion. Guess what. It's not enough to understand what multiplication is, you actually have to be able to use it. You actually have to be able to use recursion. Explaining why multiplication is important is frankly retarded. This post is frankly retarded. STOP pretending that you can explain in 5 minutes what took you months and years to understand. STOP. All you do is make people like me think they're dumb for not getting it. I'M SMARTER THAN YOU! But it took me months of ACTUALLY WRITING RECURSIVE FUNCTIONS to realize HOW FUCKING DUMB YOU ALL ARE. Stop fucking telling people you can teach them calculus in 5 minutes. Stop telling them that all it takes is this one clear explanation. It doesn't! It takes actual work. Fuck the guy who wrote that post. 
You have no idea how many years I wasted thinking blog posts like this were how to learn. If you don't get why recursion is a big deal, do yourself a favor and forget everything you read in that post, grab a pencil, some paper, and a copy of the Little Lisper. Email me in 6 months when you're finished, so that you can thank me for being the only one to tell you the truth about recursion.

Sohcahtoa82 9 hours ago 3 replies      
Recursion isn't difficult. I never had a problem understanding it. Stepping through a recursive function with a debugger that shows you the current stack trace should make it plain as day how it works.

Not trying to toot my own horn or try to make an appearance on /r/iamverysmart, but where are people struggling when it comes to understanding recursion?

Preventing server overload: limit requests being processed evanjones.ca
108 points by jsnell  10 hours ago   58 comments top 26
Animats 5 hours ago 2 replies      
I've used fair queuing to solve this problem.

I have a site and API which rates other sites, checking them for ad links and trying to match them to real world business records. Inspecting a site takes 10 seconds to 2 minutes. The API, which is usually used from browser add-ons which tag web search results, lets anyone request the rating info for a site.

Most requests are already cached and return immediately. Requests for unknown sites are queued up for the rating engine, which runs in processes separate from the web-facing side, and return "wait" to the requestor. So there's a work queue to manage.

It's managed using fair queuing. If there is no pending request from the requesting IP address, the new request goes into the queue. If there's a request from the same IP address already in the queue, the new request is held until the first one completes, and then added to the end of the main queue. No request is rejected until there are 100 requests from a single IP address. So each IP address competes against itself, and no one IP can hog the system.

A typical overload comes when someone searches for an unusual topic and flips through many pages of search results quickly. This can result in the rating engine having fifty or so new sites to examine within a few seconds. Those requests will be processed one at a time until the rating engine catches up.

This queuing system held up well when someone tried a test where they fed a huge list of sites into the API without waiting for completion of any of their requests. The rating engine ran busily for a week, but they were only tying up one rating process, and it didn't affect other users at all.

There are no adjustment parameters. It just runs. It's not perfect, but it deals well with legitimate transients and with abuse from a small number of IP addresses.
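A sketch of that per-IP fair-queuing discipline, reconstructed from the description above (class and method names are mine):

```javascript
// Fair queue keyed by requester IP: the first pending request from an
// IP goes straight into the main queue; further requests from the
// same IP are held until the earlier one completes, then promoted to
// the end of the main queue. Each IP is capped at `perIpLimit`
// pending requests, so no single IP can hog the system.
class FairQueue {
  constructor(perIpLimit = 100) {
    this.perIpLimit = perIpLimit;
    this.main = [];          // FIFO of { ip, req } eligible to run
    this.held = new Map();   // ip -> requests waiting behind an earlier one
  }
  enqueue(ip, req) {
    if (!this.held.has(ip)) {
      this.held.set(ip, []); // first pending request from this ip
      this.main.push({ ip, req });
      return true;
    }
    const waiting = this.held.get(ip);
    if (waiting.length + 1 >= this.perIpLimit) return false; // reject: at cap
    waiting.push(req);       // hold until the ip's earlier request completes
    return true;
  }
  dequeue() {
    return this.main.shift(); // { ip, req } or undefined
  }
  complete(ip) {
    const waiting = this.held.get(ip);
    if (waiting && waiting.length > 0) {
      // promote the ip's next held request to the end of the main queue
      this.main.push({ ip, req: waiting.shift() });
    } else {
      this.held.delete(ip);
    }
  }
}
```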

wutbrodo 5 hours ago 3 replies      
> Over Quota

This application is temporarily over its serving quota. Please try again later.

I'm actually not super sure if this is a single-serving joke site like isthemissiononfire.com, or if this is just a hilarious coincidence.

kyrra 9 hours ago 4 replies      
[I'm a Google employee, opinions are my own]

The GCP team did a blog post similar to this about using Load Shedding to survive a spike[0]. It's definitely a great way to survive a sudden spike in traffic, but not optimal, as serving errors is also bad (unless your clients have some sort of retry mechanism in-place). But if your clients do have a retry mechanism in place (especially one that has a backoff time on retries), it can be a great way to handle high loads.

[0] https://cloudplatform.googleblog.com/2016/12/using-load-shed...
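The client half of that bargain, retry with exponential backoff and jitter, can be sketched generically (not GCP-specific; names are mine):

```javascript
// Retry a failing async call with exponential backoff plus jitter:
// the delay doubles on each attempt and is randomly scaled, so a
// load-shedding server is not hammered by synchronized retries.
async function withBackoff(fn, { retries = 5, baseMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;   // give up
      const delay = baseMs * 2 ** attempt * (0.5 + Math.random() / 2);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}
```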

makmanalp 9 hours ago 1 reply      
The thing that trips me up a lot, along with request caps, is timeouts (briefly mentioned). There are so many network / API / retry timeouts set to arbitrarily large amounts (e.g. 30 seconds) that it's hard to predict how they'll interact, and often you don't know that some of these exist and at which layers they exist (application layer, network layer, DNS, etc).

How you often see this is that your request limiter keeps booting new requests (FIFO/drop tail), and then when you check your currently processing requests, they seem to be all in some busy loop, and hopefully you have some means of checking why. You check, and you find that they're all querying a service that accepts the request but doesn't respond ever (e.g. server down behind a proxy), but for some reason the timeout on the client side is set to 30 seconds so stuff keeps hanging until it's killed.

Anyway, this is to say that in any sufficiently complex networked system scenario, it helps to review all these timeouts holistically and in the context of your application.

boulos 34 minutes ago 0 replies      
For those asking, this seems to be hosted on App Engine and he's gone over his quota. App Engine's throttling is both a blessing and a curse (presumably he's using the free tier and never noticed this). You'll note that it's explicitly saying he's over quota not that it can't handle it ;).

Disclosure: I work on Google Cloud.

kevin_nisbet 8 hours ago 0 replies      
I've seen similar issues occur, but based around GC behavior.

As with the authors example, when requests start stacking up in a process, we also potentially increase the memory load on the GC. It's also possible to cause a sort of pathological case, where the object allocation lives long enough to get promoted to the older generations, and then die, requiring almost constant major GC's to occur to free the promoted objects.

I worked on one system, where all performance testing was done in a lab environment under perfect conditions, and external resources all taking less than 1ms to respond. However, when deployed to a production network, it consistently performed abysmally, losing an order of magnitude of performance to having more outstanding requests being processed, and the associated GC load.

Newer GC's do appear to handle object promotion much better, but it's one more thing to consider, where some GC tuning and reduction in garbage can cause an order of magnitude increase in performance.

Anyway, one more thing to consider.

kevan 9 hours ago 1 reply      
The SRE book has some useful discussion[1] on this topic too.

[1] https://landing.google.com/sre/book/chapters/addressing-casc...

ninjakeyboard 5 hours ago 0 replies      
Ironically, I'm getting a message saying the server is overloaded. Here is the cached copy: http://webcache.googleusercontent.com/search?q=cache:1oBS2v-...
ot 8 hours ago 1 reply      
A classical solution is to use a control algorithm (for example PI or PID) to avoid having to estimate the overload threshold: the server will just start shedding a fraction of the requests to maintain some metric (CPU utilization, latency, queue length, ...) below some target.

See for example [1]. This is about background jobs, but the same principles apply to load shedding.

[1] http://folk.ntnu.no/skoge/prost/proceedings/acc04/Papers/035...
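A minimal sketch of just the proportional ("P") part of such a controller, shedding a fraction of requests to hold a latency metric near a target (gain and metric choice are illustrative):

```javascript
// Proportional load shedder: rather than a hard request cap, track an
// observed metric (here, latency) and nudge the drop fraction up or
// down in proportion to the error against the target. Kp is the gain.
class LoadShedder {
  constructor(targetLatencyMs, kp = 0.001) {
    this.target = targetLatencyMs;
    this.kp = kp;
    this.dropFraction = 0;   // 0 = accept everything, 1 = drop everything
  }
  observe(latencyMs) {
    const error = latencyMs - this.target;   // positive when over target
    this.dropFraction = Math.min(1, Math.max(0, this.dropFraction + this.kp * error));
  }
  shouldDrop() {
    return Math.random() < this.dropFraction;
  }
}
```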

STRML 7 hours ago 1 reply      
There was a native NodeJS module called "toobusy" that I rewrote a few years back to remove the native dependencies:


It's worth checking out - and helps you achieve this goal without necessarily having to come up with a req/s magic number; instead you measure actual event loop latency to detect a slowdown.
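
The event-loop-latency idea carries over to other runtimes too. A minimal asyncio sketch (my own, not toobusy's actual algorithm): schedule a sleep and measure how much later than requested the wake-up fires.

```python
import asyncio
import time

async def loop_lag(interval=0.05):
    # Under load, pending callbacks delay the wake-up, so the lag grows;
    # a shedding middleware would return 503s past a configured threshold.
    start = time.monotonic()
    await asyncio.sleep(interval)
    return (time.monotonic() - start) - interval  # lag beyond requested sleep

lag = asyncio.run(loop_lag())
```

On an idle loop the lag is near zero; the point is that it is a direct measurement of "too busy" rather than a guessed request-rate limit.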

anonacct37 9 hours ago 1 reply      
A non-general solution that might work for most of us often goes by the name "shopper prioritization".

Basically if you have a site that does lots of processing (search, orders, etc) then a pretty effective strategy is often to show a generic static "down for maintenance, try again" page. This requires vastly fewer resources to serve since the page can be sitting in web server ram and doesn't result in rpc calls throughout the whole stack.

You can also do something like prioritize logged in users over casual browsers.

Since most http clients are smart clients you can even do some stuff with rngs and timeouts to selectively let people who are willing to wait around back into the site.

Of course if you're an ad-tech company it's likely that serving an error is about as fast as serving something successfully.
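
A sketch of that admission policy (the probability, the static page, and the `process` stand-in are all illustrative):

```python
import random

STATIC_PAGE = "down for maintenance, try again"  # served from RAM, no RPC calls

def process(request):
    # Stand-in for the expensive full pipeline (search, orders, etc.).
    return f"full response for {request}"

def handle(request, logged_in, admit_prob=0.2):
    # Always admit logged-in users; admit a random fraction of anonymous
    # visitors and serve everyone else the cheap static page.
    if logged_in or random.random() < admit_prob:
        return process(request)
    return STATIC_PAGE
```

Lowering `admit_prob` during an incident sheds anonymous traffic first while keeping the site usable for the users most likely to transact.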

kailuowang 9 hours ago 1 reply      
I developed a library to guard against server overload. It automatically gauges the capacity and rejects requests exceeding it.


tyingq 9 hours ago 0 replies      
Haproxy is very flexible for this sort of thing. You can, for example, limit concurrent connections to backends, and place requests in a queue with a specific timeout. Not at all automatic, but at least all the right knobs are there.

Edit: Related, Fedex's tracking is down on their website right now. It's returning quickly, though, with this in the JSON: "Service com.fedex.nxgen.trck.v8.ientities.TrckInterface is busy, max invoke limit reached"

thosakwe 5 hours ago 0 replies      
Ironically enough, clicking the link tells me that your site itself has exceeded its serving quota.

What a world we live in.

clon 4 hours ago 0 replies      

This is fast becoming my go-to tool for any sort of rate limiting at the application level.

TekMol 9 hours ago 2 replies      
This extremely simple solution works nicely for us:

 // first field of /proc/loadavg is the 1-minute load average
 $load = (float) explode(' ', file_get_contents('/proc/loadavg'))[0];
 if ($load > 10) die('Server load currently too high.');
So when the load gets too high, the server will not process the request. This will bring the server load down. An easy self-regulating system.

palakchokshi 3 hours ago 0 replies      
Is this a /s post? I get a server overloaded message when I click the link.
cfieber 7 hours ago 0 replies      
This is another good post on the topic: https://stripe.com/blog/rate-limiters
PatrickAuld 5 hours ago 0 replies      
The GCE Over Quota page is perfect.
mnx 9 hours ago 0 replies      
Is reddit using this? At times of high load, getting an error there seems kinda random.
EGreg 4 hours ago 0 replies      
Irony of ironies:


Over Quota

This application is temporarily over its serving quota. Please try again later.

peterwwillis 5 hours ago 0 replies      
In AWS, achieving peak theoretical performance requires queueing & pooling jobs of specific sizes. You are literally going to suffer costly bad performance if you don't implement backlog limits and concurrency controls.

This is basically just "stacks and queues" in a general sense, most often seen as wait-and-retry in an application, and rate controls in a service.

To do this and survive cascading failures you have to know the limits of each part of your stack and implement rate controls for each. An easy way to do this is to design network services that serve requests for applications via an API, rather than giving apps unfettered access to network resources. You can allow applications access to your network service and change rate controls for each app as needed. Alternately, you can add limits to your apps' layers at design time, rather than having to discover the limits by performance testing or trial-and-error.

The author wants a universal solution, which is a bit like saying, please solve "traffic" for me for every mode of transportation. Airplanes have different traffic than cars, so their designs will differ, although they all involve an agent that coordinates traffic flows according to a set of rules.
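
One common shape for those per-service rate controls is a token bucket, which admits a steady average rate plus bounded bursts. A sketch with illustrative parameters:

```python
import time

class TokenBucket:
    """Admit up to `rate` requests/second on average, with bursts up to
    `capacity`. A sketch, not any particular service's implementation."""

    def __init__(self, rate, capacity):
        self.rate = rate              # tokens added per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Putting one of these in front of each downstream dependency is a cheap way to encode the limits you discovered in testing, per the comment above.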

FRex 8 hours ago 1 reply      
Nginx has some config features related to just that: https://www.nginx.com/blog/rate-limiting-nginx/
EGreg 4 hours ago 0 replies      
My recommendations for rate limiting:

1) Sign session ids that you issue and reject requests that don't have a valid signature. This can be done entirely in software at the router level without any I/O

2) For each session, do authentication. Unauthenticated sessions get lower caps.

3) For authenticated sessions, have various schemes for X units of Y in Z seconds. And then use this database - sharded by Y - before each expensive Y.

4) Y should be prefixed with a user role or payment plan or whatever, followed by the actual resource id. So people can buy another payment plan.

5) Possibly let people pay to access a resource beyond the quota.

6) Make clients recognize quota errors and retry with exponential backoff.

7) Host static resources in the app, fallback to a CDN, and only host dynamic resources on your own source servers.

8) Wherever possible, cache things in the client, keeping in mind that you have to evict the least recently used items eventually.
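
Point 3's "X units of Y in Z seconds", combined with the role-prefixed keys of point 4, can be sketched as a per-key sliding window (an in-memory sketch; a real deployment would shard this store by Y):

```python
import time
from collections import defaultdict, deque

class WindowLimiter:
    """At most `max_units` hits per key in any trailing `window_seconds`."""

    def __init__(self, max_units, window_seconds):
        self.max_units = max_units
        self.window = window_seconds
        self.hits = defaultdict(deque)  # key -> timestamps of recent hits

    def allow(self, key):
        now = time.monotonic()
        q = self.hits[key]
        while q and now - q[0] > self.window:
            q.popleft()  # drop hits that fell out of the window
        if len(q) < self.max_units:
            q.append(now)
            return True
        return False
```

Because the key embeds the role or payment plan (e.g. "free:resource1" vs "paid:resource1"), upgrading a plan simply moves a user to a key with a larger quota.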

abritinthebay 9 hours ago 1 reply      
This is basically how Hapi works in the NodeJS world.

If the event loop lag is too long (you configure this) it means you're doing too much! So it issues 503s to all new traffic until the server can cope with the requests it's receiving and gets the loop time back down.

Works very well and you can configure it to do the same with memory limits as well.

Doesn't fix all issues, but it's a good way to keep your site usable by some users at least.

They were partners in fighting crime. The only problem: Neither was a cop atavist.com
39 points by samclemens  7 hours ago   7 comments top 2
sliverstorm 4 hours ago 2 replies      
I don't have much of value to add, but I have met a number of these guys, and I always wonder/wish there was some way we could apply their enthusiasm without giving them the authority to cause real harm.
stordoff 5 hours ago 3 replies      
Slightly off-topic, but popping a paid subscription modal when I'm barely more than a paragraph into a long-form article isn't exactly a good way to get me to stick around.
TFHE: Fast Fully-Homomorphic Encryption Over the Torus tfhe.github.io
166 points by modalduality  13 hours ago   83 comments top 13
phkahler 10 hours ago 2 replies      
One interesting thing about this: You are performing known operations on unknown data. But theoretically you could simulate a generic computer whose program is encrypted data as well, thus enabling unknown operations on unknown data. However, with speeds in the ms per gate we are a long way from that being practical right now.
otoburb 12 hours ago 0 replies      
For a layman like myself, downloading and skimming the reference papers at the bottom of the page, starting with Craig Gentry's 2013 paper[1], really helped.

[1] https://eprint.iacr.org/2013/340

tome 11 hours ago 4 replies      
What operations can I do homomorphically with this library? The page says

"With the cloud-keyset, the library can evaluate a net-list of binary gates homomorphically at a rate of about 50 gates per second per core, without decrypting its input. It suffices to provide the sequence of gates, as well as ciphertexts of the input bits. And the library computes ciphertexts of the output bits."

but what does "evaluating a net list of binary gates" come to in practice? What operations could I expect to be able to perform?

Bromskloss 8 hours ago 1 reply      
What does "over the torus" mean here?
anfractuosity 9 hours ago 1 reply      
Sounds very interesting! I'm going to have to look at it in more detail. I'm just wondering how it compares to something like https://github.com/shaih/HElib
Bromskloss 8 hours ago 0 replies      
Would it be correct to say that general homomorphic computing is now (and perhaps already before this) possible, though slow?
saganus 11 hours ago 4 replies      
This looks very interesting!

However, not being an expert on FHE, is there a way to leverage this on current RDBMS systems for example?

It says the library can evaluate binary gates. If we would like to run a SQL query for example, how do we translate it to a series of gates? Is it possible?

Or is this so low level that we basically would need to build our own "processor" with binary gates and then build the rest of the stack on top of it so we can, in the end, run a query?

Can anyone shed some light on how exactly can we take advantage of this library?

michwill 10 hours ago 2 replies      
That's a seriously cool thing to have in the toolbox!

Does it produce only encrypted output, or can it optionally produce unencrypted results also? Can it optionally use public data as an input?

Also I am wondering whether it could be accelerated on GPUs. I worked with a guy who accelerated a standard FFT on CUDA 100..1000 times for scientific computations (and later NVidia copied his code, lol). I wonder if something similar can be done here

fenollp 9 hours ago 1 reply      
It looks like the slowest routines are FFT and GEMM (CPU bound). I wonder if one can find DSPs easily for racked servers. Maybe hardware h264 encoders can be repurposed that way? I obviously don't know what I am talking about! Would an FPGA implementation accelerate execution?
sandGorgon 11 hours ago 3 replies      
Will this be useful for machine learning in the same way as this?


gigatexal 6 hours ago 0 replies      
So who's going to write the Python wrapper for this?
EGreg 11 hours ago 2 replies      
Is this available now? Can we do fast homomorphic encryption baby??
jondubois 11 hours ago 0 replies      
I feel like there are so many use cases for this library.
How to Stop Apologizing for My Stutter, and Other Important Lessons longreads.com
77 points by samclemens  9 hours ago   36 comments top 7
forthelove 8 hours ago 2 replies      
Stutterer here. Was in speech therapy for a long time as a child; my sister actually became a speech therapist b/c of that experience. It will always be a challenge (I'm 37 now), and it seems to rear its ugly head at random times not necessarily tied to stress. Two nights ago I made my first ever best man speech in front of about 150+ people and I maybe stammered once during the 5 or so minutes. Felt amazing and I had people asking me if I was a standup comic or spoke professionally. They have no clue the daily mental gymnastics I perform to master the stuttering.
nodesocket 9 hours ago 4 replies      
I also have a stutter, and I've blogged about being a single founder with it (https://justink.svbtle.com/being-a-founder-with-a-speech-imp...). I've had a speech impediment since I was a kid. I was told I would grow out of it. Still waiting for that to happen. :-)

I've noticed my stuttering has seasons/cycles of worse times, which seem to be tied to stress or anxiety over important calls or meetings. My stuttering has certainly been frustrating and frankly limiting as a founder, since my latest company (https://elasticbyte.net) requires more person-to-person interactions (talking to clients and closing deals).

The singing technique (mysteriously don't stutter when singing), while it is effective, does not work in a business setting. Darn!

hgl 3 hours ago 1 reply      
I'm a stutterer too. My condition is a bit special, I think. When I'm alone, I don't stutter at all if I think out loud or just read something on the screen; every scary word turns into a piece of cake. But as soon as I realize someone can hear me (even if it's remote, like video chat), I start to stutter, pretty badly. I wonder if other people are in the same camp.

I didn't stutter when I was a kid until I played with a neighbor kid who did, and it frightened me that I might stutter like him whenever I needed to talk. As time progressed, the fear was reinforced, and I never grew out of it. I wonder if it qualifies as classical conditioning?

I also speak other languages, and the severity is different in different languages.

mcone 7 hours ago 0 replies      
I'm also a stutterer. I wish I could share with you the overwhelming shame, humiliation, and ostracism I experienced as a child in my classroom environments. As a result I feel tremendous compassion for others with physical and mental limitations. Speech therapy really helped me help myself, but I still sometimes stutter.
Overtonwindow 9 hours ago 1 reply      
From an engineering perspective, I wish more technologists researched the intersection of speech and technology. For many who stutter there's a device called the SpeechEasy, which is marketed as something to improve fluency and reduce stuttering. Unfortunately for most it doesn't work; for those for whom it does work, the effect wears off. It costs well over $3,000 for the intro pocket model and is typically not covered by insurance. What does it do? It creates an echo of your voice in your ear. Literally just an echo; the choral effect, if you've ever seen The King's Speech. This device could be built off the shelf at a MicroCenter for a couple of hundred dollars. There is all kinds of technology for the disabled that is grossly overpriced, and I really wish technologists investigated it more and found ways to disrupt these expensive devices.
40acres 5 hours ago 1 reply      
I had a stutter growing up; it was terrible. Whenever I got noticed I just froze up. I particularly remember one English teacher who knew I was bad at public speaking yet repeatedly called on me during class to read sections (more than other kids).

I was terrified of going to high school with a stutter, I knew how cruel kids could be, but out of nowhere my stutter disappeared. I never really looked into what phenomenon caused my stutter to go away, but it was a huge weight off my back.

southphillyman 8 hours ago 2 replies      
"They are rooted in childhoodwhich is the only time stuttering can be reversed. Once youre an adult, there are only ways of hiding"

Don't think this is true. Bill Walton and Bill Withers both were in their 30s before they stopped stuttering I believe.

Choose Your Paradox - the downside of the Axiom of Choice billwadge.wordpress.com
66 points by rutenspitz  9 hours ago   11 comments top 4
danharaj 7 hours ago 0 replies      
Constructive/Computable insights into variants of choice:


The way mathematics is taught it's easy to think that there's only one notion of truth and logical justification. For mathematics of finite objects that can be a very compelling story. Once you let infinity in, and you will because infinity is where all the fun's at, there are a multiplicity of concepts of truth.

The axiom of choice is an incredibly powerful statement about infinite objects in a system, namely logic, which is inherently finitary[1]. Any strong statements about how infinity works is going to imply bizarre things to us finite mortals.

For example, if you assume the negation of AC then you can prove that there's a collection of nonempty sets whose cartesian product is empty. Equally bizarre!
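
For reference, a standard product formulation of AC, whose negation is exactly the claim above that some family of nonempty sets has empty product:

```latex
\text{AC}:\qquad
\forall I\;\forall (S_i)_{i \in I}\;
\Bigl( \bigl(\forall i \in I:\; S_i \neq \emptyset\bigr)
\;\Longrightarrow\; \textstyle\prod_{i \in I} S_i \neq \emptyset \Bigr)
```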

AC and not AC are both independent of ZF, which makes sense because ZF is an axiom system for reasoning about monstrously infinite objects. Stronger infinities are harder and harder to say anything about.

In a constructive mathematical universe (yet another notion of truth that's perfectly good), as usual, you have to be careful in how you state the axiom of choice. One formulation is a theorem and another implies law of excluded middle. The latter one ought to be considered the right importation of classical AC into constructive universes [2].

[1] There are infinitary logics but they don't magically allow you to perform infinite amounts of reasoning. All physically plausible logical reasoning is inherently a finite process.

[2] https://pdfs.semanticscholar.org/68d6/790cdbfda26cf71311e137...

ikeboy 5 hours ago 1 reply      
>But it also implies that ZF is consistent. This sounds nice, too, but is actually a disaster. It means that we can't prove the consistency of AD with ZF (assuming the consistency of ZF).

I don't think this is strictly speaking correct. For instance, Con(ZF) itself trivially implies that ZF is consistent (which is literally what it says). But we can prove that ZF+Con(ZF) is consistent, assuming that ZF itself is consistent. (For that matter, we can prove that ZF+not(Con(ZF)) is consistent, again assuming that ZF itself is consistent. Don't ask.) So just because something implies Con(ZF) doesn't mean we can't prove that it plus ZF is consistent, assuming ZF is consistent.

If ZF is allowed access to Con(ZF) it can prove a great many things that ZF alone cannot, without generating any paradoxes, as far as I know.

I may be missing something, this always blows my brain.

soVeryTired 6 hours ago 2 replies      
>the union of two minorities is a minority, and the intersection of two majorities is a majority

Is that a typo? Shouldn't it be the other way around?

DonbunEf7 6 hours ago 0 replies      
Quote: "One possibility is to treat AC as a powerful drug and take it only when necessary. Theorems should come with consumer labels saying what went into them. So if you see a box on the shelf of 'Banach and Tarskis Miracle Duplicator! Feed Multitudes!', it will say on the back of the box 'Contains AC'."

Indeed, many mathematicians are sensitive to this and try to point out when they invoke the Axiom of Choice, and there are other mathematicians who deliberately seek non-AC-powered variants of theorems in order to put them on less nebulous and more constructive ground.

Personally, I view AC as another reason to consider more generalized foundations of mathematics. For example, if you know anything about topoi, it turns out that we can view set theory as a topos on a point, and any topos on a complete Boolean algebra comes pre-equipped with an inherent axiom of choice! [0]

[0] https://ncatlab.org/nlab/show/axiom+of+choice

Podcasting patent is totally dead, appeals court rules arstechnica.com
76 points by danso  5 hours ago   5 comments top 4
dogruck 47 minutes ago 0 replies      
Wouldn't it be great if the listeners who donated to Adam Carolla's fund got their collective $500,000 back?
stevecalifornia 3 hours ago 0 replies      
Do licensees get their money back? What if they were pressured by the lawsuit?
excitom 3 hours ago 1 reply      
I hope this signals the demise of patents that are basically "take something people have been doing since forever, but do it with a computer." In this case, publishing a sequence of stories.
Euclid's Elements (1997) clarku.edu
86 points by nilsocket  12 hours ago   54 comments top 11
wyc 10 hours ago 0 replies      
This is one of the best-written books ever! Most non-fiction works strive (knowingly or not) to reach such a fine form. Words are massaged into terms, sentences into propositions, and certain paragraphs into arguments. This work is the purest form of that, a true paragon with enviable succinctness. Even if you're not into math, try picking up a copy of Euclid's Elements to see how articulate thoughts _can_ be.
aphextron 10 hours ago 2 replies      
Taking a proofs based Euclidean geometry course was the single most useful class I've ever taken. It totally changes the way you think about mathematics, and even made me a better writer. The way it teaches you to begin with a premise and reach logical conclusions through concrete, connected steps is applicable to all fields of thought. Anyone unfamiliar with Euclid should remedy that immediately.
kqr2 10 hours ago 4 replies      
For a nicely illustrated and colored text of Euclid's Elements, see Byrne's Six Books of Euclid:


davidmr 7 hours ago 0 replies      
I spent a year in a "Great Books" college, where there are no textbooks, no lectures, etc., just primary sources progressing roughly chronologically.

For first year math, you go through the Elements almost entirely front-to-back. I have mixed feelings about the Great Books programs in general, but the Euclid class was remarkable. It tends not to be a math-heavy group of students, but even those who think they're bad at math can still follow (for the most part) the geometric proofs. It gave all of the students--those of us who were mathematically inclined and those who weren't--a shared vocabulary and methodology for talking about actual mathematics in a way that none of the other math classes I took after transferring to another university did.

If anyone wants to go through them by yourself, do yourself a favor and get the Heath-annotated copies. Those annotations can be a life-saver when you get stuck.

leephillips 7 hours ago 1 reply      
A couple of interesting tidbits about the Elements that I've run into:

It had a profound influence on Abraham Lincoln. He said (he was largely an autodidact) that the Elements taught him what it meant to actually know that something was true, or words to that effect.

It's an example of how profoundly the West is indebted to Arabian (sometimes called "Islamic") scholarship. For centuries, the version that Europeans actually studied was a Latin translation of an Arabic translation that Arab scholars made from the original. They rescued much of our Greek heritage this way.

sevensor 9 hours ago 1 reply      
How much effort must have gone into the Geometry Applet? It's a shame that it's no longer available. (I get a 403 for http://aleph0.clarku.edu/~djoyce/Geometry/Geometry.html.) But even if it were, I don't think I've had a functioning Java browser plugin in half a dozen years.
nathell 10 hours ago 1 reply      
Shouldn't the (1997) in the submission title read (300 BC)?
adrianratnapala 8 hours ago 1 reply      
There is a conventional truth that the Elements was a standard textbook for thousands of years, but now that society is no longer a slave to classicism, nobody actually learns their plane geometry that way. And I had just assumed that I had learned mine some other way.

Indeed I had learned bits and pieces from my father and my teachers. But thinking back, I realized that the solid chunk of education I got, where I saw and understood a good body of proofs, was from reading the Elements while on a family holiday in Canberra.

It's not like there was anything else to do.

jackfoxy 3 hours ago 0 replies      
Excellent! I have Heath's edition and read through it 20 years ago. Prompted me to take it off the shelf and bookmarking this site.
mjfl 10 hours ago 3 replies      
Would it be possible to rewrite Euclid's Elements with a proof assistant like Coq?
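
It has been attempted: the GeoCoq project, for instance, formalizes Tarski-style axiomatic geometry in Coq and derives many Euclidean propositions from it. For flavor only, here is a toy Lean proof in the spirit of Euclid's Common Notion 2 ("if equals be added to equals, the wholes are equal"), stated over natural numbers rather than magnitudes:

```lean
-- Every step of such a proof is checked mechanically by the assistant.
example (a b c d : Nat) (h₁ : a = b) (h₂ : c = d) : a + c = b + d := by
  rw [h₁, h₂]
```
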
sillysaurus3 8 hours ago 2 replies      
A lot of people don't know this, but in the same way Euler is pronounced Oiler, Euclid is pronounced Oiklid.
Contra Grant On Exaggerated Differences slatestarcodex.com
33 points by gr__or  2 hours ago   6 comments top 3
ve55 1 hour ago 1 reply      
If you enjoyed this post you should read a few others that have been posted by Scott recently (on the linked website).

If any change were to be made, the least that I would hope for is that people stop responding to every argument with nothing but hatred. Just pure, unfiltered hatred. It's very difficult nowadays to find communities, or even individuals, who are good at this.

peacetreefrog 1 hour ago 0 replies      
"In the year 1850, women were locked out of almost every major field, with a few exceptions like nursing and teaching. The average man of the day would have been equally confident that women were unfit for law, unfit for medicine, unfit for mathematics, unfit for linguistics, unfit for engineering, unfit for journalism, unfit for psychology, and unfit for biology. He would have had various sexist justifications women shouldnt be in law because its too competitive and high-pressure; women shouldnt be in medicine because theyre fragile and will faint at the sight of blood; et cetera.

"As the feminist movement gradually took hold, women conquered one of these fields after another. 51% of law students are now female. So are 49.8% of medical students, 45% of math majors, 60% of linguistics majors, 60% of journalism majors, 75% of psychology majors, and 60% of biology postdocs. Yet for some reason, engineering remains only about 20% female.

"And everyone says Aha! I bet its because of negative stereotypes!

"This makes no sense. There were negative stereotypes about everything! Somebody has to explain why the equal and greater negative stereotypes against women in law, medicine, etc were completely powerless, yet for some reason the negative stereotypes in engineering were the ones that took hold and prevented women from succeeding there."

whatrusmoking 31 minutes ago 1 reply      
If this makes sense to you, you haven't actually talked to women in this field who trust you.

Scott has a bad habit of setting up bad strawmen and doing literature reviews without talking to domain experts.

(Frankly the weakness of the arguments he's choosing to debunk makes me doubt that he even engaged in this topic in good faith. If you think the opposing viewpoint is summarized by "And everyone says 'Aha! I bet it's because of negative stereotypes!'", then you're not serious about this conversation.)

Show HN: Is the stock market going to crash? isthestockmarketgoingtocrash.com
701 points by truffle_pig  14 hours ago   295 comments top 70
pdog 12 hours ago 10 replies      
If you're looking for The Single Greatest Predictor of Future Stock Market Returns[1], here it is: http://www.philosophicaleconomics.com/2013/12/the-single-gre...

This is a long read, but it's worth it. The metric can be calculated in FRED[2], and as a predictor of future returns, it outperforms all of the most common stock market valuation metrics, including cyclically-adjusted price-earnings (CAPE) ratio[3]. (Basically, the average investor portfolio allocation to equities versus bonds and cash is inversely correlated with future returns over the long-term. This works better than pure valuation models because it accounts for supply and demand dynamics.)

[1]: http://www.philosophicaleconomics.com/2013/12/the-single-gre...

[2]: http://research.stlouisfed.org/fred2/graph/?g=qis

[3]: http://www.multpl.com/shiller-pe/
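
The metric itself is just a ratio, computable on the FRED series the post describes; on made-up numbers:

```python
def equity_allocation(equities, bonds, cash):
    # Aggregate investor allocation to equities, the predictor described
    # above; per the linked post it is inversely correlated with future
    # long-term returns. Inputs here are illustrative dollar totals.
    return equities / (equities + bonds + cash)

# e.g. $60 in equities out of a $100 aggregate portfolio -> 0.6
allocation = equity_allocation(60, 30, 10)
```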

runako 13 hours ago 9 replies      
I've never seen market valuation expressed as market cap as % of GDP. I'm not an economist, so I'll leave the detailed arguments to them. But it would be at least useful to explain why you think this is a meaningful metric as compared to those typically used to measure market valuation (e.g. P/E ratios etc.).

Your graph also ties your valuation metric to the 2000 peak and the 2008 peak. However, there were crashes in 1990 and 1987 as well. Should readers conclude that the 1987 peak level was also too high, and that therefore valuations over the last ~30 years have been too high as well? (Abstaining from investing in the stock market at levels above the 1987 crash would have resulted in the loss of tremendous opportunity for wealth creation.)

There are a lot of opinions implicitly expressed in this site; it would be good to try to make those explicit.

uiri 13 hours ago 2 replies      
For market overvaluation, it says: 9.1 / 10 "DEFCON 4"

DEFCON 5 is peacetime, DEFCON 1 is imminent nuclear war. For example, during the Cuban Missile Crisis, the US reached DEFCON 2. Should this say DEFCON 2 instead? Or is "above" normal readiness the intended meaning?

misja111 12 hours ago 3 replies      
The metric used to calculate market overvaluation is interesting but it has little value for predicting a stock market crash. Let's take the last 3 major US crashes:

1987: this crash was caused by automated trading systems which could run wild in the absence of any prevention regulations such as circuit breakers

2000: the collapse of the dotcom bubble

2008: start of the financial crisis caused mainly by opaque credit default swaps and packaged subprime loans

Of those 3, only the dotcom bubble seems to be a bit related to the market overvaluation metric. And even right before the dotcom crash there were plenty of economic gurus who argued that classic overvaluation metrics were no longer valid because we were now in a 'new economy'.

The other two crashes were caused by black swans: occurrences that nobody was aware of and that were only understood afterwards. Most likely the next crash will be a black swan as well.

lr4444lr 13 hours ago 2 replies      
Can someone with an actual economics degree explain to me whether it's a valid criticism of the "Market cap as % of GDP" metric that many US companies derive value from multinational labor and consumption, and if not, why not? Thanks in advance.
mendeza 13 hours ago 5 replies      
What about student loan debt? How does that factor in, and how would the economy or the stock market be affected?

Right now student loan debt is at 1.4 trillion

source: https://www.debt.org/students/

indescions_2017 8 hours ago 0 replies      
The correct answer, of course, is that no one knows, because the future is opaque and unpredictable. And indeed you have some very smart professionals going to cash or directly betting on a 5-10% correction in the S&P500. And a set of equally smart fund managers calling for a 2600 target by mid-2018.

What we can say with some certainty, based on options activity, is that if a single day 3-4% drop in the S&P500 occurs it can trigger a massive unwind in short volatility positions:


And there are several political risk factors on the near-term horizon, including the possibility of a government shutdown in late September due to the failure of Congress to extend the debt ceiling (yes, they are arguing over who is going to pay to fund the border wall with Mexico). It certainly should surprise no one if tomorrow looks very different from the extraordinarily low-volatility landscape we face today.

The Case For Long Volatility by Eric Peters


pillowkusis 13 hours ago 3 replies      
A site like this seems dangerous at best. Nobody can predict the stock market. Nobody can predict when a stock market is more likely to crash. This site tries to indicate otherwise. Whatever causes the crash, it probably won't be one of the indicators listed here.
avip 13 hours ago 1 reply      
I love the design and phrasing. This is just a well-done website.

It would be really interesting to see your collapse pyramid over time. How did it look in 2000? 2008?

benmarten 13 hours ago 2 replies      
How is the heat matrix diagram calculated? It seems to be wrong. Public Debt has a 3.7/10, while it looks like it's around 8.5 in the heat diagram.

Looking at the individual ratings:

- Household Debt: 5.5/10
- Market Overvaluation: 9.1/10
- Market Volatility: 0.3/10
- Public Debt: 3.7/10

--> SUM = 18.6/40, or 46.5%

Also I noted: Drawing a linear trend line through the "Market Overvaluation" diagram, does make it look a lot better though. One could argue that people get used to certain levels, hence a growing trend over time.

Taking only these factors into account, it does not look like the market is gonna crash soon. In my opinion it's likely going to be caused by another factor not listed here ;)
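
The arithmetic in the ratings above, spelled out (the values are the site's scores as quoted in this comment):

```python
ratings = {
    "Household Debt": 5.5,
    "Market Overvaluation": 9.1,
    "Market Volatility": 0.3,
    "Public Debt": 3.7,
}

total = sum(ratings.values())                 # 18.6 out of 40
percent = 100 * total / (10 * len(ratings))   # 46.5%
```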

daotoad 10 hours ago 0 replies      
Good idea for a website, should be able to get you some nice revenue from intermittent visits. You probably want to focus on financial services for your ads.

I'm not going to say anything about your numbers and your models other than, without the ability to see how they looked at previous crashes, it's hard to see if the site is useful. To the innumerate masses and emotional investors the flickering numbers are persuasive enough. So they really don't matter.

On the bad side, your UX is god-awful. Use an oldish, slightly crappy monitor to look at it and you will discover that your background is indistinguishable from the foreground. The top bar of the box completely disappears, too. Also, a row of buttons is NOT a good tabbed interface--there is no indication that clicking on "Market Volatility" is going to reload all the content below the row of buttons. Maybe make actual tabs, at least make that stuff a distinct box.

This could be a nice little side product to make you some extra money. Get some GA on there, and slowly add features. I think a bit of interactivity and the ability to customize the predictive models through some drag and drop could actually make the page sticky and get people coming back.

omg_ketchup 12 hours ago 3 replies      
Site just displays a blank page. No error or anything.

I think that's a better statement than whatever the app actually does.

ringaroundthetx 5 hours ago 0 replies      
So VIX doesn't give an indication of much. The VIX formula has changed so many times, and the human behavior around the assets that VIX tracks has changed to reflect those changes and the new products those changes are based on.

Different people gamble in weekly S&P 500 options than gambled in monthly S&P 500 options. Different people gamble in today's weeklies, with five consecutive weekly expirations listed at any given moment, than gambled in the original one-week-at-a-time weekly options.

The options market itself has had ebbs and flows in interest.

And the self-fulfilling prophecy of the market being propped up whenever everyone buys put options expecting it to crash has disillusioned a lot of people from participating at all. People know what the central banks are up to; why pretend to have confidence in any of it? The Swiss National Bank is printing money to buy US stocks for free. Everyone's creating money through new bond issuances to buy things for free.

This all contributes to a lower VIX.

qubex 13 hours ago 1 reply      
Economist here. You should really keep in mind that the same GDP must go both towards paying off the national debt and paying off household debt. Also you should track commodities (at the very least, the ratio between put & call options).
where_do_i_live 12 hours ago 1 reply      
Your volatility section seems to be a very poor indicator of a future crash in the manner you are using it. Volatility is not a predictor but a descriptor. An analogy, I think, is the weather stick: is the stick wet? Then it is raining. It is a very poor item to use in your context.

Further, sustained periods of low volatility are often indicators of complacency among investors and of higher chances of bubbles; they can signal a higher future risk of a market crash, not a lower one. I think you need to re-evaluate how you use volatility.

tveita 12 hours ago 1 reply      
Normalizing household debt against the GDP makes the assumption that we are comparing the debt with the ability to pay for it.

But according to graphs like this, even though the GDP has been rising, median households have not been getting a corresponding increase in income: https://en.wikipedia.org/wiki/Household_income_in_the_United...

So the income we are adjusting against is not necessarily going to the people that are in debt!

saimiam 10 hours ago 0 replies      
I was (sort of) there when the 2000 tech crash happened and was in the thick of it when the 2008 crash happened.

This thread and a few offline conversations made me reexamine what I believe about the stock market and the nature of the 2000 and 2008 collapses. Of course, I'm not an econ nor do I have data to back up anything I'm saying.

All manias, from tulips to tech IPOs to housing bubbles, are born when the common person joins the frenzy. On the flip side, the mania collapses when the common person walks away or never shows up to the party. For the tech IPO frenzy of 2000, the common person never even showed up to use all those exotic new ideas which were getting funded and going public. During the housing bubble, the common person bought and sold houses, which set up the flywheel. Eventually, the common person walked away from the asset in question, bringing down the entire charade.

Today, the market is soaring. People are starting to wonder when gravity will reassert itself but in my view, this time the difference is that the common person cannot walk away. Unless adblocking and disdain for social media become extremely mainstream, the common person is so busy amusing themselves to death online that they are not going to leave the tech mania. Companies like FB and Google have made the web sticky.

Does this mean the stock market will rise indefinitely? I don't know. I do know that once there is a captive market comprising everyone online, no company is going to stop advertising or figuring out ways to reach buyers online.

We are in a new age where you just can't get away from the web. We are the product but we also have no way of exiting the dragnet.

yuhong 3 hours ago 0 replies      
Yeah, the US economy is basically built on constantly growing debt, which can't last forever. My favorite is the ad bubble now; ads are basically designed to increase consumption. It is probably worth mentioning China too: http://www.zerohedge.com/news/2017-08-06/chinas-minsky-momen...
Nursie 12 hours ago 1 reply      
"NaN% more overvalued than just prior to the 2008 financial crisis,"

I think there might be a few coding errors still lurking in there.
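Guarding display strings against NaN is a small fix; a minimal sketch in Python (the helper name and message format are hypothetical, not the site's actual code):

```python
import math

def fmt_overvaluation(ratio):
    """Format a valuation comparison for display, guarding against bad data.

    Returns e.g. '23.0% more overvalued', or 'n/a' when the inputs
    haven't loaded yet (which would otherwise render as 'NaN%').
    """
    if ratio is None or math.isnan(ratio) or math.isinf(ratio):
        return "n/a"
    return f"{100 * (ratio - 1):.1f}% more overvalued"

print(fmt_overvaluation(1.23))          # 23.0% more overvalued
print(fmt_overvaluation(float("nan")))  # n/a
```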

mxschumacher 12 hours ago 2 replies      
American companies sell products & services outside of the United States. Comparing American GDP with the aggregate value of the US stock market is deeply misleading, especially given a historical comparison: foreign markets such as China have gained in relative importance over the timeframe under consideration.

When looking at debt, one should not just observe the nominal amount, but also the interest rates, which have never been lower. Large companies can tap public debt markets and borrow billions at 1.5% over a timeframe of ten years. Risk is thus lower than the website suggests (at lower interest rates, a company can carry more debt). Additionally, returns to equity will be higher (the interest expense is smaller, so net profits are bigger).
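A back-of-the-envelope sketch of the interest-rate point, with made-up figures (the EBIT and debt levels are purely illustrative):

```python
# Hypothetical firm: how cheaper debt changes the interest burden.
ebit = 10_000_000_000   # earnings before interest and taxes, $10B (made up)
debt = 30_000_000_000   # outstanding debt, $30B (made up)

for rate in (0.06, 0.015):      # a historical-ish rate vs. ~1.5% today
    interest = debt * rate
    coverage = ebit / interest  # interest coverage ratio: EBIT / interest
    print(f"rate {rate:.1%}: interest ${interest / 1e9:.2f}B, "
          f"coverage {coverage:.1f}x")
```

The same debt load that eats 18% of EBIT at 6% costs only 4.5% of EBIT at 1.5%, which is the sense in which low rates let companies carry more debt.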

timsayshey 13 hours ago 2 replies      
Really cool idea. As someone who hasn't really investigated the market indicators for collapse, this is eye-opening. It breaks things down into plain English. Hope this goes to the top for some rational/interesting conversation.
anonu 12 hours ago 1 reply      
As the site makes clear, nobody ever really knows if the market is going to crash. On the market valuation side, they claim the current market is overvalued. But overvalued is a relative term, as you have to value against something, and that something is usually historical.

The way I see it, though, the markets are a big voting machine: they're making predictions about the future and incorporating future expectations. With the current US administration still pondering the tax plans and infrastructure stimulus packages that were promised, the market may be underpriced?

mathiasben 13 hours ago 2 replies      
I feel as though the "stock market" following the 2008 crisis has become further insulated from the larger economy's fundamentals. Wages can continue to not keep up with inflation, the savings rate continues its downward slide, household debt service payments consume an ever-increasing slice of disposable income, etc. All the while, the type of dramatic dislocation event seen in 1929 or 1987 is unlikely to occur: the market "circuit breakers" ensure any crash is a slow-moving trend and not a single calamitous event.
bluetwo 12 hours ago 2 replies      
The volatility index, or VIX, has become a popular measurement to reference in the context of predicting the market over the past couple years.

The problem is that I have yet to see any shred of evidence that the VIX has predictive power over the future value of the stock market.

It is derived from the prices of S&P 500 options, reflecting the volatility the market is currently pricing in, but that is it.

Does anyone have any evidence the VIX has value?

csomar 12 hours ago 0 replies      
Does it make sense to have "marketcap" / GDP if the Nasdaq/DowJones has non US companies like Alibaba? Or is it taking these into account?
movedx 1 hour ago 0 replies      
Can you please open source this under an MIT or some license you agree with?
brookside 9 hours ago 0 replies      
A great read on how to capitalize on the upcoming crash! The Sale of a Lifetime: How the Great Bubble Burst of 2017-2019 Can Make You Rich[1]

Also good is the author's earlier book The Great Crash Ahead [2] "outlining why the next financial crash and crisis is inevitable, and just around the corner coming between mid-2012 and early 2015"


1. https://www.amazon.com/Sale-Lifetime-Great-Bubble-2017-2019/...

2. https://www.amazon.com/Great-Crash-Ahead-Strategies-Turned/d...

Kiro 13 hours ago 1 reply      
I got a "Add Create React App Sample to your home screen" notification on my phone.
apsec112 12 hours ago 1 reply      
I think you could estimate crash risk much more accurately with the prices of deeply out-of-the-money put options. Those are effectively a betting market on whether stocks will crash or not. We should expect option prices to take into account every major factor (not just these four), because if they didn't, people would get rich by trading on the "missing" info until prices corrected themselves.
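One sketch of the idea: the risk-neutral probability of the index finishing below a strike can be backed out of a tight put spread, since a digital put is the limit of (put(K) - put(K - dK)) / dK. All prices below are invented for illustration, not market data:

```python
import math

r = 0.01       # risk-free rate (assumed)
T = 0.25       # 3 months to expiry
dK = 5.0       # width of the put spread
put_at_2000 = 6.30   # hypothetical price of a deep-OTM put struck at 2000
put_at_1995 = 5.85   # hypothetical price of the put struck at 1995

# A tight put spread approximates a $1 cash-or-nothing digital put...
digital = (put_at_2000 - put_at_1995) / dK
# ...and undiscounting its price gives the risk-neutral probability.
prob_below_2000 = math.exp(r * T) * digital

print(f"implied P(index < 2000 at expiry) ~ {prob_below_2000:.1%}")
```

Risk-neutral probabilities aren't real-world probabilities (they embed risk premia), but they are exactly the betting-market odds the comment describes.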
cs702 12 hours ago 1 reply      
I love the idea, the simple design, and the humble tone of the byline ("no one knows for sure, but there are indicators that can help us guess. We can chart these indicators to give us the illusion of foresight.").

However, I have two suggestions. First, the numeric rankings (such as "5.5 / 10") need context: why not say something like "10 is the highest value reached in the historical record"?

Second, the explanations you give for choosing these indicators need a bit of work, as evidenced by some of the comments and questions on this thread. Most lay readers won't understand why the ratio of total stock market capitalization to annual GDP is important.

grandinj 12 hours ago 1 reply      
The answer is of course: YES.

The more important question is when, and the answer to that is "who knows".

The market is a chaotic system, with severe non-linear responses. As such, it can remain stable much longer than people think, and crash much harder than anyone expects.

TekMol 13 hours ago 3 replies      
The page trades solely on its nice graphics and sensationalist wording. There is little to no content of substance.

For example the page calculates "Market Overvaluation" as the US stock market value divided by the yearly US GDP. Hilarious.

mcguire 12 hours ago 0 replies      
Is this a psychological experiment? All I get on Android Chrome is a white screen.
Glyptodon 11 hours ago 0 replies      
Question (as someone without domain knowledge): could someone explain what the expected relationship between GDP and total stock market value is? GDP includes non-publicly-traded and even non-private activity, while presumably the stock market's valuation is at least somewhat driven by expectations of future growth/profit rather than current productivity. I don't doubt that there's a relationship of some kind, but what is the simple ratio actually showing?
module0000 10 hours ago 0 replies      
So, if the stock market is hypothetically predicted to crash in 10-25 days - what are you going to do? Short it now? Short it later? Buy?

Just curious what HN readers think. For the giggles...I'm going short when the tape says market sell orders exceed the rate of bid additions, and the opposite for going long. I like long-term analysis as much as the next guy, I just never, ever, ever, ever, ever make decisions based on it.

lg 13 hours ago 2 replies      
Could the fact that a lot of US companies book profits overseas and keep them there for tax reasons foil your assumption about the meaning of a high US market cap:domestic GDP ratio?
AJRF 9 hours ago 0 replies      
"We can measure Market Overvaluation by looking how much the stock market costs vs how much it is providing." Isn't this the opposite of what the stock market is supposed to provide? I assumed valuations are, for the most part, guided by a company's outlook for the future, not the present.
socrates1998 8 hours ago 0 replies      
Low Volatility might actually be an indicator that the stock market is going to blow up, rather than stay calm.

Volatility tends to cluster, and periods with really low volatility are often an indicator that there is a big movement coming.
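The clustering effect described above can be illustrated with a toy GARCH(1,1) simulation (the parameters are illustrative, not fitted to any market):

```python
import random

random.seed(0)

# Toy GARCH(1,1): today's variance depends on yesterday's shock and
# yesterday's variance, so quiet days tend to follow quiet days and
# turbulent days follow turbulent days.
omega, alpha, beta = 0.00001, 0.1, 0.85

var = omega / (1 - alpha - beta)   # start at the long-run variance
returns = []
for _ in range(1000):
    shock = random.gauss(0, 1) * var ** 0.5
    returns.append(shock)
    var = omega + alpha * shock ** 2 + beta * var

# Clustering shows up as positive autocorrelation in *squared* returns.
sq = [x * x for x in returns]
mean = sum(sq) / len(sq)
num = sum((sq[i] - mean) * (sq[i + 1] - mean) for i in range(len(sq) - 1))
den = sum((x - mean) ** 2 for x in sq)
print(f"lag-1 autocorrelation of squared returns: {num / den:.2f}")
```

The raw returns themselves are serially uncorrelated; it is the squared returns (the volatility) that cluster, which is why a currently low VIX says little about the level of the market tomorrow.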

lordnacho 11 hours ago 0 replies      
One could argue that the volatility scale should be the other way round; that the diamond should be showing extreme values on everything other than household debt, which is middling.

The market is normally calm on the way up, which is why you might think its current upward movement will soon be interrupted by a volatile down-move.

sigmar 9 hours ago 1 reply      
>The VIX is generally consistently low (10 - 15) until it isn't. To get a sense of what a crisis would look like, we can compare to a few historical values.

What's the point of using a metric that can turn on a dime in a predictive model?

tome 13 hours ago 2 replies      
Market Volatility section:

Current risk:

NaN / 10

"Calm waters"

(I'm using Edge)

coverband 11 hours ago 0 replies      
Interesting analysis, but I'd not have included public debt as a risk factor. If anything, increasing public debt provides upward support for the equity market, regardless of whether the money goes to public investments, tax cuts or bad government spending.
jostmey 12 hours ago 0 replies      
"We can chart this to give us an illusion of foresight"

Got to respect the author's humility in foretelling the future.

cm2187 12 hours ago 0 replies      
Blank page for me. Don't know if it is there, but a nice chart is the size of the Fed balance sheet vs. the S&P 500 since 2005. It suggests a large part of the valuation of stocks is generated by QE, which the Fed intends to start withdrawing this year...
neilwilson 12 hours ago 0 replies      
'Public Debt' is a private asset. Why is having more wealth a bad thing?

The idea that being 'in credit' with a sovereign government with its own currency is a problem has been thoroughly debunked. Primarily by Japan.

Time to stop repeating the myth.

peternicky 11 hours ago 0 replies      
Why does this site report "the stock market is closed"?
kmfrk 12 hours ago 0 replies      
If nothing else, I like how this might stir some interesting discussions about the state of the economy.

One thing I'd like is a link to the cited data to make it a little more serious and conducive to debates.

rrggrr 6 hours ago 0 replies      
Household debt should be measured against household income and not against GDP.
unknown_apostle 12 hours ago 0 replies      
Cute site :-) Btw, we have the added issue that the volatility of volatility appears to be rising, meaning periods of apparent big calm turn into big price swings more rapidly.
forbiddenlake 13 hours ago 0 replies      
Says "Stock market is closed" at 10:45AM EDT. Is it really?
tambourine_man 9 hours ago 0 replies      
Site's broken on mobile:


neom 13 hours ago 0 replies      
The distribution of household debt is too significant to ignore when judging the health of the economy in this way, especially when you consider how the GDP is generated and who is generating it.
odammit 11 hours ago 0 replies      
Nah, Trump says it's fine. Don't worry about it. It's the best. May see a dip in 2020.
malynda 11 hours ago 0 replies      
Another pedantic remark: Next to the clock, you should include a timezone. Very interesting!
kurtisc 6 hours ago 0 replies      
>Is the *US stock market going to crash?
franciskim 12 hours ago 0 replies      
Lowest volatility in 27 or so years according to VIX, apparently, which is actually a warning sign.
JVIDEL 9 hours ago 0 replies      
This is actually a pretty useful site

Don't get to say that a lot around here

wuliwong 12 hours ago 0 replies      
I like the idea. I think it could benefit from some transparency into the calculations.
yosito 11 hours ago 0 replies      
I fully expected this to be a page with the single word "Yes."
davidreiss 12 hours ago 0 replies      
Only the elite know. It's so funny how people think that recessions, depressions, stock market crashes, etc are some "natural" event.

A stock market crash happens when the elite decide there should be a market crash. When they pull money out of the market.

myth_drannon 11 hours ago 0 replies      
You can set up webpack to minify/uglify your source files.
iliveinseattle 12 hours ago 0 replies      
Market cap as a percent of GDP is a very bad indicator. In today's world, a very large and increasing percentage of revenues is derived from outside the U.S.
woah 12 hours ago 0 replies      
Diagram doesn't work on Safari with Adblock.
nnd 12 hours ago 0 replies      
What library did you use for the counting animation?
mathiasben 12 hours ago 0 replies      
The market overvaluation section could do with including the yield spread on bonds, as this is sometimes quoted as a volatility risk indicator.
petters 12 hours ago 0 replies      
> Create React App Sample

Shows up on Chrome mobile.

Sujan 11 hours ago 1 reply      
hathym 12 hours ago 0 replies      
The real question is when?
artursapek 13 hours ago 1 reply      
A high VIX would indicate that the market is crash-ing.
19890903 13 hours ago 2 replies      
Oh dang! That household debt sure is scary. Where's the real-time data sourced from?

> U.S economic risk as of ...

With the click of a button, maybe also allow a view of where it was at a point in history...? i.e. U.S. economic risk as of [insert point in history]

Great work so far. Simple and usable.

The asteroid impact which led to the extinction of the dinosaurs 3quarksdaily.com
74 points by dangerman  10 hours ago   24 comments top 6
ouid 9 hours ago 0 replies      
In summary, the meteorite collided with a small shallow sea and vaporized a large volume of sulfur-containing gypsum deposits. The sulfur in the atmosphere interacted with water and formed sulfuric acid in sufficiently large quantities to cool the earth by about 10°C and make photosynthesis non-viable everywhere on earth for long enough that large herbivorous dinosaurs, incapable of hibernation, died off entirely.
mannykannot 6 hours ago 2 replies      
The survival of any birds at all has puzzled me, as modern birds seem to have very active lifestyles and metabolisms that would not tolerate long-term scarcity (as many Cretaceous birds flew, I am guessing the same applied to them.)

This is also true of small mammals, though many can hibernate. I sometimes wonder if it was winter in one hemisphere, and all the surviving mammals were hibernating.

Some modern birds, such as scrub jays, cache food for the lean season - perhaps that is how their ancestors survived?

EwanG 10 hours ago 1 reply      
From the article "But it is only last year that we successfully drilled into the impact site, and only now, for the first time, do we really understand why the impact was so fatal. And if the meteorite had arrived ten minutes earlier, or ten minutes later, it would still no doubt have inflicted devastation, but the dinosaurs would still be here and you wouldn't."
jMyles 9 hours ago 3 replies      
I find the tone of absolute certainty off-putting.

> The asteroid is falling towards Earth on a fixed trajectory, but the Earth itself is spinning beneath it, one revolution every 24 hours. This corresponds to around 1,000 miles an hour in the region of interest. So arriving ten minutes earlier or later would have placed the impact some 150 miles further to the East or West. And if this had happened, the asteroid would have missed the shallow gypsum-rich continental shelf, and encountered only the oceans on either side. No gypsum in the impact zone, no sulphuric acid haze, no long deep winter. While things might have been pretty rough for anything living within a couple of thousand miles or so, the rest of the world would hardly have noticed.

> ...

> And if the meteorite had arrived ten minutes earlier, or ten minutes later, it would still no doubt have inflicted devastation, but the dinosaurs would still be here and you wouldn't.

While the specific hypothesis about the chemistry of the impact zone is very cool - and drilling and analysis a very compelling project - I think that the credibility of the conclusion is clouded a bit by the idea that the reader is supposed to accept with certainty that the events unfolded exactly as the author describes to within a 10-minute window.

Reason077 8 hours ago 1 reply      
The BBC covered this very topic in an interesting documentary recently:



(It does make me wonder about anthropogenic SO2 emissions, and how much of a cooling effect these have on our climate!)

perseusprime11 1 hour ago 1 reply      
"And if the meteorite had arrived ten minutes earlier, or ten minutes later, it would still no doubt have inflicted devastation, but the dinosaurs would still be here and you wouldn't."

What does the author mean by "we wouldn't be here"? Were there no other animals during the Jurassic age? Wouldn't humans have evolved just like other animals? Maybe not as advanced, but nevertheless around, hiding, etc. What am I missing?

Let Forest Fires Burn nytimes.com
66 points by mcone  8 hours ago   78 comments top 17
falcolas 7 hours ago 5 replies      
Speaking as a Montanan here.

The policy of "let some fires burn" has been in practice for a few decades now in our corner of the woods. Ever since the Yellowstone fire in the 80's, we've focused our efforts on containment instead of extinguishment (i.e. make a best effort to keep it away from human settlements); letting land lost to the fires just burn out naturally while stopping them from claiming too much land. Should we let them burn more? Perhaps.

But it's never so easy. The cost of just letting some go when they're a couple dozen or even hundred acres can spiral out of control when you're suddenly attempting to stop a 300,000 acre fire that has already burned out 16 homes and is threatening towns. Yes, firefighters die fighting fires. Civilians also die when an uncontained fire goes from a theoretical threat on the horizon to consuming their house in a matter of minutes.

Fire doesn't respect property boundaries and can travel at tremendous speeds (upwards of 10 mph). And they aren't really like floods: it's not just the forests that burn, it's the flat lands with all the nice dead grasses. Grass fires can move even faster than forest fires, given a wind to push them along.

Humans aren't the only ones affected, either. The livestock which used that land for feed still has to be fed, or face starvation. Argue all you want about the inhumanity of our treatment of livestock generally; letting them starve due to a lack of hay (tied to a lack of money) is worse.

There are some who feel that they should not have to support those affected by natural disasters. I'm sure others will be happy to return the favor.

(300,000 acres ~= 142,000 ha)


SandersAK 7 hours ago 2 replies      
Some forests are self-regulating when it comes to fire. Some forests are not. What defines forest is actually pretty complicated, even for Federal and state agencies. Controlled burns work in conjunction with thinning practices and bone-piling (stacking big burns away from other areas).

The root of the problem has been, and continues to be, that structurally we separate "fuels" and "prevention" teams in the forest service, where Fuels deals with assessing when and how to burn, and Prevention operationalizes it and also does firefighting.

Lastly, fire management suffers from a lack of talent and shared learning as fire career paths involve people moving to different regions which are geographically incredibly diverse. You don't fight fire in the east side of Oregon like you do in Florida.

The lack of cohesion, mixed with a serious lack of funding, and a lot of common misconceptions publicly leads to a mishmash of fire management across the country.

source - was a firefighter. left because of all these issues.

warrenm 8 hours ago 1 reply      
There are trees that cannot reproduce unless their seed pods are baked open

Animals coming back to scorched areas carry all kinds of seeds with them that love the mineral- and carbon-rich environment

Some deer can grow beautiful atypical racks only when feeding in areas ravaged by fire

And burning out old growth and underbrush allows whole new generations of forest to start, because their impediments to growth are gone.

Seems like the best thing to do is to let the fires run their course (at least the ones started naturally) .. and maybe note the risks of living near treelines like people who live near water lines know they could be flooded

davidbanham 7 hours ago 2 replies      
In Australia we do a lot of controlled burning through the winter. They're called "hazard reduction burns" and are carefully planned operations involving lots of firefighters and appliances.

The aim is to burn the leaf litter and other fuels, but not let the fire get hot enough to kill established trees.

You start to see signs of regrowth very quickly. HRs are usually done in the late winter. By mid spring the landscape is an incredible sight of bright green new growth out of black tree trunks.

There are a lot of species of trees and plants native to the Australian bush that need fire as part of their breeding and growth cycles. It's integral to land management here.

elihu 7 hours ago 1 reply      
I asked a firefighter what she thought about controlled burns a while back. She said that one of the advantages of controlled burns is that you can set them when conditions are favorable, so that they won't get out of control and burn areas they didn't want to burn just then.

So, the "just let it burn" strategy when fires start might lead to a lot more property damage than a strategy of proactive burning.

Also, from a carbon footprint point of view, it seems advantageous to mix burning and logging, so that not all the carbon returns straight into the atmosphere.

msla 7 hours ago 1 reply      
Letting fires burn is standard policy at this point. We learned it the hard way, particularly after the Big Burn, or the Great Fire of 1910.


It is uncontroversial to let isolated fires burn, but fires close to houses and towns must be contained.

aphextron 7 hours ago 4 replies      
The problem with this is local politics. "Let wildfires burn" sounds great when it's not your multimillion dollar vacation home going up in smoke. Yes, in the long run it would be better for everyone involved. But we humans are anything but longsighted when it comes to dollars and cents.
pvaldes 7 hours ago 0 replies      
The problem is that mature trees store things that humans need badly to survive.

When a forest burns entirely you can say goodbye to a lot of much-needed water for years, and there is also a problem with carbon: we want carbon stored in the trees, not released into the air. And the soil layer needs decades to be created but can vanish in a week. You can't just think in terms of 10 years. These creatures can live thousands of years.

What is very good for some animals, plants and fungi can also be very bad for other animals, plants and fungi. A net of burned areas isolating islands of intact forest would probably be a better solution, providing a patchwork environment with lots of opportunities.

And we should remember that some very old forests reach a mature level that is fire-resistant (plenty of humidity in the soil). Laurisilva forests, for example, do not start burning unless a considerable amount of energy is applied.

sizzzzlerz 7 hours ago 1 reply      
In a book about the smoke jumpers from several years ago, one of the members made a profound statement, jokingly, probably, but maybe not. He said that instead of battling wildfires the way we do now, we just throw money on the fire and wait until the rains come.

With the continued droughts throughout the west and up into Alaska and Canada, along with the effects of global warming, the forest lands are just dry tinder waiting for a spark. Fires growing so large they can't be put out will become more and more common. Maybe letting them burn until the rains come is the right solution.

strebler 7 hours ago 1 reply      
Sure, sure, let the fires burn. It's an easy argument to make when it's not your property. What about when the fire does over $3 billion in direct damages, burns 1.5 million acres and forces the evacuation of over 80K people [1]? I'm fine with tax dollars going towards trying to prevent that.

[1] https://en.wikipedia.org/wiki/2016_Fort_McMurray_Wildfire

woliveirajr 7 hours ago 0 replies      
Sometimes small and constant fires clean up the area, clearing dead leaves and so on. Constantly stopping those fires can lead to accumulation, until a great wildfire comes and destroys more than "nature following its natural fate" would have.
goda90 7 hours ago 1 reply      
It's well known that because of firefighting, there are build-ups of flammable material causing subsequent fires to be bigger and hotter. I wonder if there's a point where, if we just let the fires burn, they'll be so hot that the natural benefits will be lost and the forest damaged for much longer than usual. Maybe we need to ease into it by not putting out the fires, but still keeping them from getting really intense.
smileysteve 7 hours ago 1 reply      
I am under the impression that fire departments regularly practice "controlled burns" to destroy brush and have fires occur "naturally", during damp times and away from properties.

At least I am under this impression from controlled burns happening in Talladega National Forest and Cohutta wilderness.

jessaustin 6 hours ago 0 replies      
It seems that USA's War On Fire has had the same results as its other Wars, On Drugs and Terrorism: more fires with more harmful results.

We will be the laughingstocks of history, if humanity somehow survives us.

nitwit005 7 hours ago 1 reply      
The missing step is often that we continue to let communities pop up in these areas that we fully expect to periodically burn down.

Of course, we often still let people build their houses in flood plains, but we seem to be slowly getting better at that.

agumonkey 6 hours ago 0 replies      
Anybody know websites to discuss ideas and tech about forest fire fighting ? (be it thermodynamics, organization, monitoring, chemistry etc)
ktRolster 7 hours ago 1 reply      
This quote from the article is important:

Still, considerable disagreement remains among scientists about exactly how forests should be managed.

Show HN: AsciiDots a 2D esoteric language inspired by circuits github.com
111 points by aaronduino  13 hours ago   24 comments top 9
elsherbini 10 hours ago 1 reply      
This looks like a perfect skeleton for bringing Minecraft redstone to an ASCII game like Dwarf Fortress. Very cool! Especially if you could design "sub-circuits" by zooming in, sort of like the chips in Robot Odyssey[0] or in the Super Circuit Maker mod for Minecraft[1].

[0] http://www.formauri.es/personal/pgimeno/temp/RO/stereorecord...

[1] https://www.reddit.com/r/supercircuitmaker/top/

bonyt 10 hours ago 0 replies      
This reminds me of befunge[1], which is also a two dimensional programming language, and has been around for quite a while. This language looks a bit simpler and easier to understand, though. Neat.

[1]: https://en.wikipedia.org/wiki/Befunge

"[I]n Befunge, there is no comment syntax: to embed documentation in the code, the programmer simply routes the control flow around the 'comment' area, so that the text in that area is never executed"

knolan 11 hours ago 0 replies      
This is pretty cool!

On the other hand I'm reminded of the horrors of LabVIEW.

minxomat 11 hours ago 0 replies      
Veedrac 8 hours ago 0 replies      
A friend of mine made an even more minimal thing. Turns out he still has an internet version:


You basically have arbitrary-arity NOR gates, that are your only way of splitting signals. From it you can make anything from XORs to flip-flops to clock signals to anything you want. I remember him making a very fancy 8-segment display counter (that actually counted up over time) in one version of it back in the day.
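
The NOR construction described above is the textbook universality result; here's a minimal Python sketch (my own naming and wiring, not the friend's tool) showing XOR built from NOR alone:

```python
def nor(*inputs):
    """Arbitrary-arity NOR: true only when every input is false."""
    return not any(inputs)

# Classic constructions using nothing but NOR:
def not_(a):    return nor(a)
def or_(a, b):  return nor(nor(a, b))
def and_(a, b): return nor(nor(a), nor(b))
def xor(a, b):  return and_(or_(a, b), nor(and_(a, b)))

# Exhaustive check over all four input pairs:
for a in (False, True):
    for b in (False, True):
        assert xor(a, b) == (a != b)
```

Flip-flops then come from cross-coupling two NORs (the classic SR latch), which is where the clock signals and counters mentioned above start.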

math0ne 11 hours ago 1 reply      
Awesome, reminds me a lot of "programming" in minecraft!
aaronduino 12 hours ago 5 replies      
Author here. I'm open to any suggestions or feedback.
CGamesPlay 8 hours ago 0 replies      
I like the animation at the beginning but was disappointed that the online version didn't support it.
wgrover 4 hours ago 1 reply      
Very cool! If you're not already familiar with LabVIEW, you might check it out as there are some very interesting similarities and differences between AsciiDots and LabVIEW. A couple that caught my eye:

- LabVIEW creates loops by enclosing the repeated code in a graphical frame of sorts. That ends up being an elegant way to deal with iteration: An array that flows into a loop turns into individual elements inside the loop (one loop iteration per element) and elements that flow out of a loop stack up into an array. I wonder if something similar might work well in AsciiDots? Or more generally, a way to deal with multi-dimensional data by having dots pile up into an array?

- The way AsciiDots deals with libraries (defining inputs and outputs spatially, i.e. top/right/bottom/left) is very similar to how sub-VIs are created in LabVIEW. The way your operations like addition, subtraction, multiplication, and division take two inputs and wait until both inputs arrive to release the result is reminiscent of LabVIEW, but the way you use horizontal and vertical to mean different things is different and very interesting. Direction is mostly meaningless in LabVIEW, so I love the way that AsciiDots ascribes a meaning to horizontal vs. vertical. Can that idea go even deeper?

- Having the "dot" carry information and have a value (not just be the point of execution) is very consistent with LabVIEW and a difference from some other text-based visual languages.

- LabVIEW "wires" (the connections between things) have different thicknesses and colors to tell the user what datatype they carry. It sounds clunky but ends up giving a lot of readability to LabVIEW programs. Not sure if a similar idea could help make AsciiDots programs more readable...?

- LabVIEW has a "highlight execution" mode that slows the program down and highlights the flow of data; very reminiscent of your animated GIF. More than just a pretty gimmick, it helps debug LabVIEW programs and it'd be a great addition to your "try it online" Heroku app.

- For when arithmetic becomes a spaghetti of symbols, LabVIEW has a "formula node" that lets you type in traditional code in C++, thereby mixing graphical and traditional languages. Maybe an interesting paradigm for AsciiDots?

All in all, AsciiDots seems to have a much lower barrier to entry than other text-based graphical languages like Befunge. It could even be practical for some things, the same way LabVIEW has an interesting niche in the data acquisition and instrument control world. Maybe it could be used for data analysis, where arrays flow from "open file" to "manipulate data" and "plot results"? Or a very interesting way for children to learn some programming concepts, combining the visual-ness of things like Scratch and Lightbot while still having the kids use a text editor?!

I'll shut up now. Fantastic work! Thank you for sharing it!
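
The loop behavior in the first bullet above is what LabVIEW calls auto-indexing, and it maps neatly onto a comprehension. A hedged Python analogy (the LabVIEW feature is real; the names here are my own):

```python
def auto_indexed_loop(array, body):
    """Mimic a LabVIEW loop with auto-indexing enabled: an array wired
    into the loop is consumed one element per iteration, and elements
    wired out stack back up into an array."""
    return [body(element) for element in array]

doubled = auto_indexed_loop([1, 2, 3], lambda x: 2 * x)  # [2, 4, 6]
```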

Light Field Rendering and Streaming for VR and AR [video] gputechconf.com
51 points by thebyrd  9 hours ago   12 comments top 3
erikpukinskis 5 hours ago 1 reply      
OTOY is absolutely at the forefront of digital imagery. They're like MPEG in a Kodak world; that's how different their approach is.


1) Light fields are 4 dimensional images, light field video is a 5 dimensional stream. This is a basic requirement for hand- or head-tracked images, like in headsets or AR devices.

2) We're just getting to the point where real-time ray tracing is truly economical, and OTOY is all-in on it. Up until now, rendering has been a "bag of tricks" approach, where you try to paint sophisticated paintings on polygons. Many of these tricks fall apart when you try to do the predictive modeling required for 6DOF streaming. You see the reflections painted onto the countertop. Ray tracing actually simulates light.

3) They've fully embraced the cloud. They're offering everything they do as cloud services, which means it can work on every device, for a minimal cost, with no need for customers or users to be on the latest hardware.

4) Open formats. They're not trying to build a portal the way Oculus or Valve is; they're inventing the content pipeline and getting it integrated everywhere they can. I am skeptical that any of the closed content stores will win. We saw how big the web became, and I think a better bet is that the metaverse will be more like the web than the App Store, and that's the bet OTOY is making.
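
Point 1 above is the standard two-plane parameterization: a ray is indexed by where it crosses two parallel planes, (u, v) and (s, t), so a static light field is a 4-D function and video adds time as a fifth axis. A rough Python sketch of why head tracking is "free" with this representation (all names are illustrative, not OTOY's API):

```python
def make_light_field(nu, nv, ns, nt):
    # 4-D grid of radiance samples; a real capture would fill this
    # from a camera array rather than zeros.
    return [[[[0.0 for _ in range(nt)] for _ in range(ns)]
             for _ in range(nv)] for _ in range(nu)]

def novel_view(field, u, v):
    # Fixing the eye position (u, v) slices out an ordinary 2-D image
    # over (s, t): re-rendering for a moved head is just re-indexing.
    return field[u][v]

lf = make_light_field(4, 4, 8, 8)
image = novel_view(lf, 2, 1)  # an 8x8 2-D slice
```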

Orbach has been working relentlessly on this vision behind the scenes. Not a lot has been coming out of the company, but I've been watching him lay the groundwork for the whole next generation of content distribution for 5 years now, and he's killing it. Release after release of core building blocks.

otoy 4 hours ago 0 replies      
OTOY is contributing the ORBX container and render graph system to MPEG-I Part 2 at MPEG 120 as a 'tier 1' license (equivalent to the MIT license). Paid licenses and patents IMO should not be in the baseline minimal schema/scene graph system, or we will never get to a truly open metaverse. I made as strong a case as I could that plenty of value and IP can still be implemented as services or modules on top of such an open framework whenever this issue came up at MPEG 119 last month.

Here is one of the two ORBX docs from MPEG 119, the other (which has the full container schema) I'll post shortly.


laythea 7 hours ago 1 reply      
Is this just cool tech wrapped in a proprietary format requiring hefty licensing fees?
Generate energy with kites kitex.tech
79 points by handpickednames  13 hours ago   44 comments top 13
dietlbomb 6 hours ago 2 replies      
I see stories promoting kite energy harvesting every couple of years. They never address (one of) the primary obstacles to a functional unit: transmitting the energy to the ground. They usually hand-wave the problem by saying there will be a conductive tether, but they don't mention the challenge of designing a tether that is strong enough to hold the kite, while also conductive enough to transfer power, while light enough for the kite to stay airborne. The usual solution for ground-based power transmission is to use a high voltage, but that requires significant insulation separating the conductors, usually air. Other materials tend to fail under high voltage. So they will be limited in maximum voltage, and be forced to have a larger-diameter conductor. It would be too heavy and wouldn't fly. tl;dr: this is probably a scam.
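
The tradeoff in the comment above can be put in rough numbers. A back-of-envelope Python sketch with assumed values (copper conductors, a 5% resistive loss budget; none of this comes from an actual kite-power design):

```python
RHO_CU = 1.68e-8     # copper resistivity, ohm*m
DENSITY_CU = 8960.0  # copper density, kg/m^3

def tether_copper_mass(power_w, volts, length_m, loss_fraction=0.05):
    """Mass of copper in a two-conductor tether that dissipates at most
    `loss_fraction` of the transmitted power resistively."""
    current = power_w / volts
    max_resistance = loss_fraction * volts / current  # total round-trip R
    area = RHO_CU * (2 * length_m) / max_resistance   # per-conductor cross-section
    return 2 * length_m * area * DENSITY_CU

# 100 kW over a 500 m tether: conductor mass falls with the square of
# the voltage, which is why insulation limits are the real constraint.
low_v = tether_copper_mass(100e3, 1e3, 500)    # ~300 kg of copper at 1 kV
high_v = tether_copper_mass(100e3, 10e3, 500)  # ~3 kg at 10 kV
```

At voltages low enough to insulate cheaply, the copper alone can outweigh the kite; raising the voltage fixes the mass but runs into the insulation problem the comment describes.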
random3 11 hours ago 3 replies      
Highly related: https://x.company/makani/ founded by Don Montague, one of the Kiteboarding pioneers - likely the first one to create software to design kites and windsurf sails.
timthorn 9 hours ago 1 reply      
Kite Power Systems (http://www.kitepowersystems.com/) came to give a talk to the Institute of Physics in Cambridge earlier this year. They use a dual-kite system where one kite is generating whilst the second is being pulled back in, and, having demonstrated a 40kW system, are about to deploy a 500kW system.
beneater 11 hours ago 1 reply      
Real Engineering did a video about this recently: https://www.youtube.com/watch?v=vMTchVXedkk
slau 10 hours ago 0 replies      
I've shared an office space with Andreas and Gustaf for some time in Copenhagen (they're currently located in the great Founders House workspace). They're an awesome, extremely passionate team. It's been great fun seeing them work on the prototypes, and sitting in a bit, asking a bunch of questions, and Andreas sharing everything he knows about building drones, and all the cool tips he's willing to share.

If you think drones are loud, wait until you've heard an industrial kite/drone combo take off in an empty office space.

scblock 11 hours ago 4 replies      
I feel like a broken record (to myself) because every time I see these concepts I immediately conclude they have no viable commercial future and are a waste of everything except enthusiasm. That's too negative, because that's how some of the old coal guys still see wind in general, but it's not completely crazy. History shows that something like this will get 5-10 years of development money, make a small scale working concept (maybe), and then go nowhere.

I guess I don't know what the answer is. Pursuing new ideas is worthwhile, but I don't expect this company to have real commercial success.

obilgic 11 hours ago 0 replies      
Not sure if it would replace regular wind turbines by being cost-effective, but it could be used as a mobile energy generator.
brudgers 11 hours ago 1 reply      
In June, I drove across North Texas on I-40 and marveled at the scale of the wind turbines. At those scales (Panhandle I and II generate 400MW according to Wikipedia), I suspect that per-turbine materials cost is less of a limiting factor than site acquisition and infrastructure installation. I also wonder about modeling multi-unit installations, as shifting wind direction and intensity changes local conditions and moves one kite turbine relative to its neighbors.

It's an interesting approach.

a_imho 11 hours ago 0 replies      
Reminds me of this underwater kite project [1].

I was not able to find the numbers, but they would be interesting. Why not share them?

[1] http://minesto.com/

Grustaf 7 hours ago 0 replies      
Here's an excellent and accessible article about the principles behind kite energy:


randomerr 9 hours ago 1 reply      
I would think that the maintenance and cost to constantly (re)deploy would negate the earnings.

If it's not windy enough, you have to take them in. Too windy or rainy? Then pull the kites back. Wing ripped or bow cracked from wind or sun damage? Time to rebuild.

skadamat 11 hours ago 0 replies      
Reminds me of Makani Power! Super cool
stillhere 6 hours ago 0 replies      
What happens when: there is no wind, a flock of geese crash into it, the cable snaps, etc? Better keep these things away from buildings or it will be like 9/11 all over again from multiple angles.
Ask HN: What Happened to the Segment.com Open-Source Fellowships?
20 points by _Marak_  1 hour ago   5 comments top 3
giis 5 minutes ago 0 replies      
> I've now learned they will soon be releasing a commercial product that is strongly related to the open-source project I submitted.

Did someone from Segment inform you about this upcoming commercial product? How did you learn about it? What if there are no such plans or no upcoming product?

Are you sure about their terms and conditions? Some companies include a special clause allowing them to own or implement submitted ideas on their own.

I hope someone from segment will address your concerns.

hitekker 1 hour ago 0 replies      
What was the project that you submitted? And, for comparison, what is the commercial product Segment will be releasing?

I hope someone from Segment responds to your concerns.

jaequery 55 minutes ago 1 reply      
Sounds to me like you shouldn't be open-sourcing your project if you are thinking like that.
       cached 8 August 2017 04:02:01 GMT