hacker news with inline top comments    16 Jun 2017 News
Amazon to Acquire Whole Foods for $13.7B bloomberg.com
649 points by whatok  2 hours ago   374 comments top 56
scotchio 2 hours ago 21 replies      
Oof. I wonder what this means for Whole Foods culture and employees.

If you don't know, John Mackey, the CEO / founder, is a major believer in conscious capitalism and of empowering his employees.

Whole Food employees get paid pretty darn well with some crazy good benefits for their industry and line-of-work (UNION FREE most of the time too!).

WF banks on them being true believers and motivators of the cause - including dedicating a fair amount of paid time to trainings. I've heard mixed stories about how Amazon treats employees. I wonder how that will mesh.

So I guess I'm asking:

* What is going to happen with employee culture?

* What is going to happen with all the "Fair Trade" deals WF has in place that might not be the most economical decision now?

* Are store automation and hefty layoffs on the way?

Source: Worked at WF for 3 years

AndrewDucker 2 hours ago 10 replies      

"Amazon did not just buy Whole Foods grocery stores. It bought 431 upper-income, prime-location distribution nodes for everything it does."

phreeza 2 hours ago 13 replies      
Somewhat off topic, but I have had a weird thought circulating in my head (triggered by reading Red Plenty) that I wanted to put out there:

What is the relationship between a centrally planned economy as for example Gosplan was trying to run in the USSR, and the presence of these huge market-like but centralized entities like Amazon and Wal-Mart inside a free-market economy? This becomes particularly interesting because Amazon apparently does not have any particular interest in turning a profit.

A central problem faced by Gosplan was the collection of high quality data about supply chains and the estimation of the utility function of consumers. I would say Amazon is in a pretty good position to do both right now.

Another question, somewhat related: At what point does it become profitable for Amazon to lobby for more redistributive taxation? This might sound paradoxical because you would assume that Amazon represents the interests of its owners, who would probably suffer under such a taxation scheme. But shouldn't there be a point at which giving more disposable income to poor people will boost the overall income of an entity like Amazon (since Amazon sells many flat-screen TVs but not as many mansions or yachts)?

jarjoura 0 minutes ago 0 replies      
This might just be a strategic move to block Instacart, as every WF I've been to recently has an entire checkout section just for Instacart.
harrisreynolds 46 minutes ago 0 replies      
HUGE news in the grocery delivery space.

Groceries are one of the few large markets that require some proximity to customers due to costs and spoilage. Each grocery store is a type of mini-distribution center for grocery products.

Shipt and Instacart have succeeded to date because they use existing distribution channels and set up marketplaces for the "last mile" of delivery. This is in contrast to Webvan in the early 2000s, which tried to do grocery delivery by building its own distribution and failed spectacularly.

Amazon has become an expert in distribution and logistics. But it is clear that using their current model doesn't generally work with groceries (RIP Webvan, 1998-2001). Bananas need to be treated much differently than books.

So what does Amazon do? Buy Whole Foods!! A moderately sized grocery chain with a significant national footprint and lots of higher-income customers.

Now they instantly have a pre-built distribution channel that is already optimized for the grocery business (which again is much different than non-perishable consumer goods etc).

Things definitely just got interesting in this space!! I still believe that Instacart and Shipt can succeed, but they need to maintain a laser focus on making their shoppers and customers happy! And grow as fast as possible while Amazon digests Whole Foods!

[Note: I was the early CTO for Shipt responsible for building their grocery delivery platform and initial engineering team. Go Shipt!]

StevePerkins 2 hours ago 3 replies      
Wow... all the articles I've been reading about Whole Foods lately deal with how they're hurting, and losing market share to more affordable competitors like Trader Joe's.

If this is about Amazon thinking they can turn things around for Whole Foods, then it will certainly mean drastic changes to price, selection, and employee structure.

If this is about Amazon using a brick-and-mortar chain as a tool to help Amazon's own ventures (e.g. grocery delivery, local storage for same-day deliveries, product return and support locations, etc), then it will certainly mean drastic changes to what a Whole Foods store even is.

Either way, I can't imagine a course in which Whole Foods as we know it isn't basically over. Which doesn't necessarily bother me (I migrated to Trader Joe's and similar competitors long ago), but does seem like a big deal in the grand scheme of things. The grocery industry was heading in a Walmart-ish direction... and Whole Foods was almost single-handedly responsible for bringing a counterculture into the mainstream, and forcing all the other chains to reverse course and up their game.

lwlml 2 hours ago 2 replies      
Amazon is now (or more so now) like Evil Corp in the Mr. Robot kind of way. You can ship your Evil groceries via Evil Prime purchased via voice-command with your Evil assistant device paid for with your Evil credit-card. Evil.

Frankly I don't buy as much from WF anymore considering that what they carry is much more like "organic junk-food" than actual food.

If you really want to support the "cause", find yourself a local farmer or CSA to buy from and support them directly.


Geekette 1 hour ago 1 reply      
You know the race for global trade dominance is heating up and playing out regionally when the same company tries to buy Slack and Whole Foods in the same week. While one is familiar with conglomerates holding a diverse array of businesses, it's fascinating and slightly unnerving (in terms of market power increasingly concentrated in fewer hands) to see it in motion.
cercatrova 1 hour ago 3 replies      

I watched this video a while ago about what Amazon Go is a precursor for. In summary: if AWS is renting out server infrastructure to people, you can imagine Amazon building the infrastructure to lease Amazon Go out to other stores. They first integrate it with Whole Foods; then, as customers come to expect no checkout areas, they will only go to Whole Foods because it's faster and maybe cheaper. Then, because customers want it everywhere, Amazon could force other stores like Walmart and Target to integrate Amazon Go infrastructure into their stores, without Amazon directly competing with them, and make a ton of money leasing it out without spending money on building whole new stores. Of course, the acquisition could also be about using the stores as nodes for warehousing and delivery, but the two avenues are not mutually exclusive.

rmoriz 2 hours ago 4 replies      
Looks to me like the ultimate war between grocery stores in the US will be fought out between Amazon/WFM (premium, delivery) and ALDI/Lidl (discount, no-frills).

As a German I'm amazed by the "food haul" videos on YouTube and the positive feedback for ALDI US (belongs to ALDI Süd), Trader Joe's (ALDI Nord) and Lidl (part of Schwarz Gruppe, they just started in the US yesterday).

tuna-piano 2 hours ago 5 replies      
I'm confused.

Amazon is known to work backward from an imagined future press release, and then do the actions necessary to make that press release. How does Amazon see the future in this case?

-Amazon already has PrimeNow and Amazon Fresh, which offer a great grocery delivery service. For those who have tried these services, it's easy to see how addictive they are vs going to a physical store.

-I can't see Amazon using existing retail stores as distribution centers. I would think you really only need one grocery distribution center for each city in America, and PrimeNow (and AmazonFresh) already have that! Or, has Amazon determined that picking/packing from a retail store is actually efficient? Retail as a DC seems tough to automate: items are in the wrong spot, suspiciously missing, etc. I don't get it.

-I would have guessed that online grocery from highly automated distribution centers is where the majority of the market would be within ~20 years. Does Amazon, the king of online, not think that!?

-Or does Amazon just believe that they can run Whole Foods better than it is currently run?

-Do they just want the purchasing, existing relationships, etc to also sell their customers through other channels?

dkrich 2 hours ago 0 replies      
I don't mean to be a naysayer, I don't really know what this means. But with all of the euphoria I'm hearing regarding Amazon and the doom-and-gloom regarding "traditional" grocers, I can't help but remind myself how horribly bad people are at predicting the future. Most of the people I'm hearing make predictions about what this means have been wrong about everything to this point.

Not saying it won't be a game-changer, but it is fascinating to watch people extrapolate this out to the extremes as soon as the news hits the ticker.

To put it another way- if it were that obvious that buying Whole Foods was going to make Amazon the dominant grocery seller, why weren't people predicting that they would do it all along and asking what they were waiting for? It isn't until Amazon acts that people say "Oh yeah, that was the right move."

dawhizkid 2 hours ago 3 replies      
This is not good for Instacart. My understanding is WF was their biggest partner and source of revenue. No way Amazon will use them.
nojvek 56 minutes ago 0 replies      
I quite like Whole Foods. Now that Amazon will own it I feel less inclined to support it.

Amazon is a cut-throat profit-making machine built on human exploitation.

Whole Foods felt like the little guy who was trying to do things differently from mainstream supermarkets.

tuna-piano 1 hour ago 0 replies      
Some quotes from a 3 month old article about Amazon and physical grocery[1]

"The whole premise [of online grocery] is that you're saving people a trip to the store, but people actually like going to the store to buy groceries."

"A bunch of smart people at Amazon have been thinking about re-imagining the next phase of physical retail. They want more share of the wallet, and habitual, frequent use of Amazon for groceries is the ultimate goal."

"Long term, a stronger grocery business could position Amazon to become a wholesale food-distribution business serving supermarkets, convenience stores, restaurants, hotels, hospitals and schools. "

"A group of Amazon executives met late last year to discuss the disadvantage Amazon faced compared with grocery competitors such as Wal-Mart and Kroger because of its lack of physical stores and customer apprehension about buying fresh foods online. They decided they needed something more to jump-start Amazon's grocery push beyond plans already under way for the Amazon Go convenience store, modeled for urban areas, and drive-in grocery pick-up stations suited for the suburbs."

[1] https://www.bloomberg.com/news/features/2017-03-20/inside-am...

guiomie 12 minutes ago 0 replies      
"The transaction also may help Amazon sideline Instacart Inc., a startup that has delivered grocery orders from Whole Foods stores in more than 20 states and Washington, D.C."

... I wonder how this will play out. Is Instacart's business threatened by losing Whole Foods as a client?

pfarnsworth 0 minutes ago 0 replies      
What about Instacart? They had a sweet deal; if it continues, it means Amazon will buy them. If it gets cancelled, it means Instacart is fucked.
hew 2 hours ago 2 replies      
This is a Walmart counterattack. Walmart is on the verge of capturing large swathes of busy "middle-class buyer" pickup orders.

Amazon saw it and responded quickly.

tmaly 2 hours ago 1 reply      
Grocery stores are a tough market with razor-thin margins. Whole Foods has a ton of competitors, like Trader Joe's, that happen to still have organic options.

Since I did not see it asked: any idea if we will see some new Prime benefit at Whole Foods?

xutopia 2 hours ago 1 reply      
Amazon wants to kill retail and their biggest threat is Walmart. Walmart currently uses groceries as a way to get people in stores and when they're in the store they buy other things as well.

Now imagine if you are Amazon and you'd like all those people going to Walmart to just use your services. If I could get my groceries delivered to my door and only need to go to stores once in a blue moon I'd be really happy.

akeck 14 minutes ago 0 replies      
aws --region us-west-sf-mission grocery order-items \
    --shopping-list-id=listid-0083899e196816ab7 \
    --store-id=store-0af21d71bd9c03a1a \
    --address-id=address-0d7811218fb1ccb33 \
    --address-type=residence \
    --tags="Key=type,Value=weekly_groceries"
joshfarrant 2 hours ago 0 replies      
$13.7B is quite a satisfying amount.

That's $1 for every year since the Big Bang.

danm07 33 minutes ago 0 replies      
I wonder if, instead of Amazon building out their own stores, they're going to use Whole Foods'... unlikely, but the possibility is there if they do it well.
JumpCrisscross 2 hours ago 3 replies      
Aw, poor Blue Apron.
nihaar 2 hours ago 0 replies      
I'd be curious about what this also means for Instacart given their strategic partnership with Whole Foods and their direct competition with Amazon Fresh.
foobarbazetc 7 minutes ago 0 replies      
RIP Instacart.
dforrestwilson 1 hour ago 0 replies      
What happens to Sprout's Farmers Market?

People seem to think it will be bought, but this would seem to be negative news because the price paid for Whole Foods was about half what Sprout's is trading at.

djhworld 2 hours ago 1 reply      
I wonder if they'll roll out the tech they showed a few months ago for the store in Seattle where you didn't have to check out.
wcchandler 2 hours ago 0 replies      
Now that they control the distribution, they should control the supply. Well-functioning, scalable greenhouse solutions should definitely pique Bezos's interest. I'd love to be a head grower over a large operation like that.
aezell 2 hours ago 0 replies      
In this part of the country, it seems like Kroger's ClickList service has been very successful. I'd imagine other larger chains in other parts of the country are doing similar things. Sure, it's not delivery but if you are driving right past the store on the way home and don't even have to get out of the car, it's a pretty close competitor. That's especially true in smaller communities where Amazon's grocery delivery doesn't yet make sense. Might this be a partial reason that Amazon is making a bet on the grocery market?
awinter-py 42 minutes ago 0 replies      
ugh. In 1966 the Supreme Court forced the 3rd-largest grocery chain in LA to divest the 6th-largest (which it bought in 1960). https://en.wikipedia.org/wiki/Vons
plusbryan 2 hours ago 1 reply      
What does this mean for Instacart I wonder? Will Amazon shut them off from their stores?
joycey 2 hours ago 0 replies      
I can't wait for drones to deliver organic quinoa straight to my door.
coleca 2 hours ago 0 replies      
If the brick and mortar grocers weren't scared of Amazon before, they sure are now.
dpratt 2 hours ago 1 reply      
I wonder how long until the first beta of Elastic Grocery System?
duxup 1 hour ago 0 replies      
I buy things on Amazon when I feel the price is good or better than I can get locally. Whole Foods doesn't fit that mold at all.... now maybe they provide different products, but then Whole Foods isn't Whole Foods anymore.
panooper 51 minutes ago 0 replies      
Pretty good rundown of reactions and implications of this deal here: http://circulaat.com/main/amazon-to-acquire-whole-foods-in-1...
cphoover 2 hours ago 1 reply      
I think this might signal the walk-in, walk-out payment paradigm of Amazon Go going national. If we could say goodbye to lines, I am all for it.
balozi 2 hours ago 1 reply      
Nothing Amazon does makes sense immediately. Only in hindsight does the strategy emerge.

It would also be helpful for Amazon to disrupt the wine and liquor sector.

tgb29 1 hour ago 0 replies      
I do my banking and food shopping at Walmart; I wonder if Whole Foods will offer this. Probably not, but I'm still excited to see Amazon moving into the food retail business.
tcbawo 2 hours ago 2 replies      
Perishable goods like groceries are one of the few businesses that Amazon is not in yet. My guess is that Amazon is looking to take Pantry to the next level.
chaostheory 57 minutes ago 0 replies      
What percentage of instacart's business is tied to Whole Foods?
psadri 40 minutes ago 0 replies      
What will happen to Instacart?
buryat 2 hours ago 1 reply      
I wonder if they're going to implement Amazon Go in Whole Foods https://www.amazon.com/b?node=16008589011
sidegrid 1 hour ago 0 replies      
This is another step in how an online book store slowly becomes the Umbrella Corporation.
tonychu 19 minutes ago 0 replies      
Whole Foods market share will continue to increase with rising homosexual and gender confused culture.
skinnymuch 1 hour ago 0 replies      
Wow all cash? Amazon's cash pile will be pretty small relatively speaking after this. Less than $10B.
davidiach 2 hours ago 3 replies      
I wonder if they will keep the brand or change it to Amazon.
ckastner 2 hours ago 0 replies      
What an exciting development! It will be interesting to watch Amazon apply its logistics experience to a large-scale groceries operation.
codecamper 2 hours ago 2 replies      
Amazon is getting too big.

How's this for an investment strategy... Long AMZN, short a basket of all other mid to large retailers & grocers.

masonicb00m 1 hour ago 0 replies      
Sucks for Instacart.
hendzen 1 hour ago 0 replies      
Why would they buy Whole Foods as opposed to Grubhub/Seamless? Seems like a better fit...
md2be 2 hours ago 0 replies      
Instacart is screwed
ethbro 2 hours ago 1 reply      
"Da $#&@?!"

Sometimes I forget just how big Amazon is.

... But it's nice to know I can get my artisanal, single-source, Fair Trade, organic, small-farm, no-GMO cucumber water with one day shipping now.

dkural 2 hours ago 3 replies      
Better buy than Slack for $9B.
salesguy222 2 hours ago 2 replies      
Checkmate, brick and mortar!! Amazon is eating the world.
Justin Kan Raises $10.5MM for New Legal Startup techcrunch.com
51 points by imjk  1 hour ago   16 comments top 9
justin 31 minutes ago 2 replies      
Hello HN!

We are excited to be building in the legal space. Raising money is a necessary step, but Atrium LTS' biggest accomplishment so far is the great team of experts we have assembled here. My cofounders are Augie Rakow (former partner at Orrick, where he worked on over 100 financings and represented Cruise through their acquisition by GM), BeBe Chueh (lawyer turned founder who sold her last company to LegalZoom) and Chris Smoak (Amazon AWS, multiple-time YC founder, founder of early FB payments platform Gambit). I think our early team is doing a great job of moving quickly and we are looking for more talented people to join that team.

Read more here: http://www.atriumlts.com

gnicholas 12 minutes ago 0 replies      
Most startups are pitched with a problem statement: our users want X, and here's how we're going to give it to them. This article talks about how lawyers don't do X and how Justin thinks they should, but not necessarily that they want to.

As someone who spent 7 years working in biglaw, I can attest that there are many technology-related things that would make lawyers more efficient. For example, the large law firm I worked for ran Windows XP until 2011. This lag is due to (1) the fact that senior lawyers are generally very resistant to new technologies; and (2) the fact that law firms run custom/old software (to do things like format legal briefs to meet the standards of the relevant jurisdiction), which is mission-critical and may not be compatible with software updates.

I'm curious to see what Justin is building here, and hopefully some of the other news coverage will elucidate where the demand is going to come from for his new product/services.

Perhaps they're envisioning their "user" as corporate legal clients, who are going to pressure their law firm to use his project management software? An indirect model like this might make sense, but it sounds like a tough slog to me.

Anyone else know more?

yannyu 44 minutes ago 0 replies      
There actually is a lot of legal-industry-specific technology out there that most people have never heard of. It works well enough for what most lawyers and firms want, but it seems to have hit a local maximum for one major reason: nearly all the software and tools are written around Microsoft Word and the Office suite.

Anyone wanting to disrupt the legal industry (in the US) is going to have to deal with MS Word in one way or another. There are hundreds of thousands of attorneys who have worked with nothing but Word and Word-related tools for decades, so someone entering the space will likely need to build project management, collaboration, and communication systems around the Word ecosystem. That or build a product so compelling and easy to migrate to that people are willing to walk away from Word.

redm 7 minutes ago 1 reply      
For any legal startup to work, lawyers either A) have to really want what he's building or B) have to have it to be competitive against other firms.

I think both of these are pretty high bars, especially since lawyers bill by the hour.

Finding a good, efficient lawyer is much more about the lawyer's personality, knowledge, and ability and much less about the technology. A good lawyer can solve things very quickly at little cost. A bad lawyer can take the same matter, spend a ton of money researching it, working it, discussing it, etc., and still come out with a poor result.

I'm interested to see what he's building, but I won't be surprised if it doesn't upend the legal world.

neom 9 minutes ago 0 replies      
This is awesome and I'm really happy to see venture dollars being spent on the legal industry. One of my gripes has always been that quality legal work is traditionally expensive to obtain because the "quality" part tends to come from literally practicing law for a long time and knowing many things quite deeply. Over time law firms obviously grow in standing and therefore size, increasing costs. This can easily leave a huge part of society under-served. In theory, much of the cost is research by humans, and I presume research by humans garnering key insights could be done somewhat algorithmically by scanning case law for key points. I hope we see more and more automation and ML applied to the industry of providing counsel around aspects of the law.
SmellTheGlove 37 minutes ago 0 replies      
This is a good idea. I hope it gets traction (I'd love to work on this problem as well).

As for why some of this isn't done yet: The legal profession is old school. If you want to know where your work is, you can ask for a status memo, and you'll get billed for the time it takes to write it. Or the phone call. Either way. In industries where the product is the billable hour, you'll find things get done the way they've always been done.

You'll find small and mid size firms using more technology, but you'll also find that the more highly a company thinks of itself, the more it thinks it needs a large firm (which will bring its legacy processes, because they do work, even if not entirely efficient). Nobody gets fired for hiring Skadden, but then you also don't get to bitch when they do things the way they have for the last 50 years either. They didn't bring in $2.5B last year by accident - they're effective, and are going to be averse to process change if it risks outcomes.

If you're working on tech in this space, you also need to be aware of that. It's a tough sell to larger firms. They won't sacrifice outcomes or billables because it's all working really well for them, and they always have a glut of un/underemployed contract attorneys if they just need to throw highly educated bodies at a problem.

joshuaheard 25 minutes ago 0 replies      
I built a legal app years ago. One problem I had getting investors is that the legal market is so small. Microsoft dropped it as a vertical years ago. Corporate law is even a smaller segment of the legal industry. VC law is an even smaller segment of corporate law. How big is the market for this product?

Edit: I see from their website they claim it is a "$96 billion industry". There are only 1.2 million lawyers in the U.S. (including retired), so I am curious what this number represents.

philfrasty 42 minutes ago 2 replies      
What happened to Justin's Snapchat? Still kicking? Kinda stopped following when Insta-Stories launched. Miss Klaus though <3
seibelj 50 minutes ago 0 replies      
I have no idea about the existing legal tech space, but this sounds like a really good idea if it doesn't already exist. I hope it works out
The Calibre Content Server calibre-ebook.com
137 points by dabber  4 hours ago   51 comments top 9
criddell 3 hours ago 3 replies      
I've always just used Calibre because it's a nice way for me to strip DRM and then archive my books. I've never really used it in any kind of active way. Since I buy pretty much everything from Amazon, I already effectively have a content server.

Does anybody have a link to an online server (with public domain books)? I'm curious to see what the presentation is like. What's the typography like? Does the screen dim after 30s? What's the browser battery consumption like compared to an ereader app?

Long term, my big concern about ebooks is DRM. Amazon's most recent version (KFX) hasn't been cracked and workarounds involve getting Amazon to send you an older version of the file with older, crappier hyphenation and layout. I've started mostly buying DRM free books from Amazon, but they don't make it easy to find them.

robk 5 minutes ago 0 replies      
I love Calibre and contribute recipe updates often but don't really see the value of the server. I script things to download via the command line and then send to my Kindle over email. It's more robust than the front end and less hassle than running a server you have to use the browser to access. Given file sizes are small (this week's Economist is about 15MB) it's perfectly fine to use email.
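A minimal sketch of the kind of script described above, assuming calibre's bundled CLI tools (ebook-convert can build a periodical from a news recipe; calibre-smtp can mail the result). The recipe name, relay, credentials, and addresses below are all placeholders, and the actual calibre invocations are shown as comments rather than executed:

```shell
#!/bin/sh
# Hypothetical nightly news-to-Kindle pipeline using calibre's CLI tools.
recipe="economist"
out="/tmp/${recipe}.mobi"

# 1. Fetch and build the periodical from a calibre news recipe:
#      ebook-convert "${recipe}.recipe" "$out" --output-profile=kindle
#
# 2. Mail the result to the Kindle's receive address (the sender must be
#    on the approved list in the Amazon account settings):
#      calibre-smtp --attachment "$out" \
#        --relay smtp.example.com --username me --password secret \
#        sender@example.com my-device@kindle.com "news"

echo "would send ${out}"
```

Run from cron, this reproduces the "download via command line, send over email" flow without the content server at all.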
fest 2 hours ago 1 reply      
I thought that this was a recently added feature. Turns out it has been there since at least 2011 (based on a forum thread about it). Will have to try it!
blfr 3 hours ago 7 replies      
I love it in principle but, considering how small most books are relative to the storage space offered by modern devices, even mobile ones, what is the upside of storing content centrally instead of just carrying a copy with you?
dancsi 3 hours ago 3 replies      
It would be great if this feature was extended so that I can sync my books across different Calibre installations on different machines.
r3bl 2 hours ago 0 replies      
They have also made Calibre look a little less crappy on high definition screens, and added a conversion to .docx: https://calibre-ebook.com/new-in/twelve
fest 2 hours ago 0 replies      
It looks like you still need the Calibre GUI to add books; this is a read-only interface.

I was hoping that this could replace my Google Drive folder with various papers on interesting topics I'd like to read.

rufugee 3 hours ago 1 reply      
I've been searching for a solution to handle the various ebooks I buy and to allow me to annotate them in a centrally stored location. I have kindle purchases, pdfs, epubs, and mobis from various publishers.

Moon Reader (http://www.moondownload.com) would be great if it had a desktop or web-based client...however, it's only supported on Android. If Calibre can give me this experience, its value just increased immensely. Looking forward to trying this.

dubyte 2 hours ago 0 replies      
I did a little Go server to expose an OPDS feed based on a dir structure.


So far I've only tested it with Moon Reader.

C.S. Lewis's Greatest Fiction Was Telling Kids They Would Like Turkish Delight atlasobscura.com
42 points by tbirrell  3 hours ago   10 comments top 7
quotemstr 10 minutes ago 0 replies      
How can you _not_ like Turkish delight? I have some every time I see it --- which is rarely, granted.
barrkel 16 minutes ago 0 replies      
I still like Fry's Turkish Delight (made by Cadbury). The combination of the thin chocolate covering with the weight and almost chewy mouthfeel of the rose-flavoured gel works well - it makes what chocolate there is more unctuous. The weight of it is part of what sells it for me, even though I expect it's mostly water captured in the gel.

Actual sugar-dusted Turkish Delight cubes are less impressive. I don't mind the rosewater flavour - that's what I associated with the product - but it's just thick jelly. There's no contrast. And since I despise nuts, I'll not be eating any with crunchy bits either.

tcopeland 10 minutes ago 0 replies      
When my wife's parents come visit us from Moldova they usually bring over a box of Turkish Delight; it's always unveiled with cries of "Turkish Delight for the young prince! Ha ha ha!"
DoofusOfDeath 28 minutes ago 1 reply      
When my kids were younger, I took to reading The Lion, the Witch and the Wardrobe to them at the dinner table. Just for fun, on some of those evenings we'd have Turkish Delight at the table. The whole experience was really nice overall.

But I have to say, many of the Turkish Delight flavor variations weren't very appealing: rosewater, mint, etc.

We tended to like the fruitier ones such as lemon and orange. (And there was also pistachio, which I recall liking.)

jandrese 43 minutes ago 1 reply      
I liked the Robot Chicken sketch on this where the nerd thinks Turkish Delight is some kind of sex act.

Caution: NSFW


sp332 35 minutes ago 1 reply      
This may be my sugar-drenched American palate, but I don't remember Turkish delight being that sweet. The rosewater ones are really good though.
vostok 29 minutes ago 0 replies      
As a child, I really liked the nut flavored Turkish delight. I never liked the regular rosewater flavor though.
Why Your Encrypted Database Is Not Secure [pdf] iacr.org
11 points by blacksmythe  38 minutes ago   3 comments top 3
falcolas 1 minute ago 0 replies      
A few years back, Google worked through their attempts to encrypt MySQL from end to end; it's a damned hard problem when you consider all the bits which would need to be encrypted to protect against someone with access to the OS or hypervisor.

I haven't heard that they were successful.

thruflo22 5 minutes ago 0 replies      
Any good recommendations for running Postgres transparently as a fully encrypted database (not just on a table by table basis)?

Or alternatively is this a problem that's better / just as viably solved with an encrypted filesystem?

marvel_boy 33 minutes ago 0 replies      
Is SqliteCipher (sqlite) also vulnerable to the techniques shown in the paper?
Harmony Explained: Progress Towards a Scientific Theory of Music (2012) arxiv.org
36 points by adamnemecek  5 hours ago   6 comments top 5
stfwn 24 minutes ago 0 replies      
> You can't start a science textbook like that.

'Music theory books' make no claim to be scientific. Western music is a cultural construct and there are books that explain how to practice this particular field. For insiders, the three initial statements the author attempts to burn down make perfect sense. What it means to locate a (part of) a music piece in a 'harmonic system' makes just as much sense to musicians as the rest of us calling a lollypop 'sweet'. I wouldn't criticise cooks for not describing taste by the molecular contents of their ingredients.

While it's interesting to dissect music differently from how musicians may analyse it, completely rejecting local experts' understanding of a phenomenon runs counter to finding out more. Is a farmer a complete fail whale for knowing when to plant his crops based on when a certain type of flower first blooms, rather than the precise position of the earth relative to the sun?

If I read only music theory by this author, I would still not have a clue how to play the piano. He/she wastes a lot of words on what could otherwise be described as the difference between Metis and Techne, or local knowledge and technical knowledge.

I pasted two long quotes about the distinction between Metis and Techne here: https://pastebin.com/ZYKJDFRN

laretluval 4 minutes ago 0 replies      
Does this work address the fact that people raised in non-Western musical cultures don't experience pleasure/displeasure based on harmony?


michaf 48 minutes ago 0 replies      
There is a newer version 2 from 2014: https://arxiv.org/html/1202.4212v2
lyra_comms 47 minutes ago 1 reply      
Reads as a working lit review (huge collection of ideas the author has come across) rather than a compact synthesis.

Discuss on Lyra: https://hellolyra.com/c/382

Show HN: Revert the Twitter UI changes github.com
23 points by kamranahmed_se  4 hours ago   24 comments top 8
strict9 1 minute ago 0 replies      
Yes, people will complain every time a UI is significantly changed. But there's something else too.

Perception is that significant product development resources were devoted to adding rounded corners, while the issues often associated with Twitter seem to get worse and receive little product development attention. A few include: harassment, racism, armies of bots for brigading, and the increasing use of the platform as an instrument by a foreign power to exert influence on the American electorate.

Maybe they are trying and not communicating it well. But a UI refresh is a clear indication of where product development effort was spent, and those issues seem to be getting worse.

Sirikon 1 hour ago 6 replies      
But why.

Why all that hate for every new UI nowadays?

Slackwise 36 minutes ago 0 replies      
Skip Twitter.com and start using https://tweetdeck.twitter.com instead.
intoverflow2 23 minutes ago 0 replies      
It's honestly pretty much the same UI anyway. Totally just feels like a CSS hack on the web side.
_Microft 1 hour ago 2 replies      
Is that actually advisable? The same attitude made people stick with Windows XP or Firefox 3.5, don't you think?
satsuma 45 minutes ago 0 replies      
Meh, Twitter's new UI isn't bad enough for me to worry about reverting it locally. If they pulled a YouTube that'd make sense, and even there I got used to it.
anotheryou 1 hour ago 1 reply      
why not use Stylish (pure CSS, but you can inline the few icons as blobs)?
egwynn 1 hour ago 1 reply      
The Maison de Verre, one of the world's best-preserved modernist buildings jstor.org
21 points by tintinnabula  4 hours ago   1 comment top
throwanem 39 minutes ago 0 replies      
Google Images has interior views and elevations which the article does not. I particularly like the grand salon from what appears to be an overlooking balcony: http://atlasofinteriors.polimi-cooperation.org/wp-content/up...

I think this must be the first time in my life I've actually seen modernism achieve a satisfactory combination of the industrial, on the one hand, and the humanistic, on the other - I'm not sure I love the nonskid flooring everywhere, and a Persian rug or similar might help tie together the conversation area centered on the divan, but the overall impression is nonetheless quite an appealing one. Based on the geometry of the space, I suspect the aural environment would be similarly pleasant - the high, airy volume would support close conversation without turning a more declamatory style into a farrago of reverberation, and support a nuanced piano performance without swallowing quiet passages or ruining loud ones.

The salon being apparently the focus of the place, or at least of that photography I've found, it's hard to get a sense of the other rooms, and modernist architecture in general has a nasty habit of being a bear to maintain due to design features more oriented toward presenting the architect's vision than withstanding the loads placed on an edifice by its environment. But at the very least, should I ever find myself in the neighborhood, I'll most certainly have to find some way to visit.

Polynesian people used a kind of binary number system 600 years ago (2013) nature.com
172 points by ghewgill  13 hours ago   94 comments top 19
netcan 2 hours ago 1 reply      
The Polynesian history of the (European) Middle Ages keeps getting more interesting and mysterious.

Evidence for Polynesian contact with the Americas is mounting. Regardless, the medieval colonization of all the Pacific islands is evidence of seamanship and navigation skills European mariners didn't achieve until well into the age of sail. European sailing was much more trade/cargo oriented & we know a lot less about "pre-contact" Polynesian maritime culture, so it's hard to put the technologies side-by-side. That said, it looks like the Polynesian ability to aim for and land on small remote islands circa 300CE was not surpassed by any other culture until the 18th century.

Then there's the strange case of Easter Island with its giant stone statues. Giant stone statues & "how-the-f%$-did-they-do-this" megalithic monuments have existed for a long time (e.g. the Sphinx), but these are usually produced by populous civic cultures. "High civilization" to use an out of date term. For a tiny island to produce this kind of an artistic culture without obvious predecessors or descendants is strange.

As always in prehistory (in the literal, "before written records" sense), what we know is very little. But, it wouldn't be hard for me to believe that surprising numerical or mathematical systems existed there. It's a culture that has surprised us to the point of disbelief several times.

jloughry 50 minutes ago 0 replies      
Roger Bacon wrote about binary notation in the 13th century, although he called bits `fingers'. John Wilkins credited Bacon in his book Mercury, or the Silent and Swift Messenger published in 1641.

Wilkins' book is basically a tutorial on communications security (COMSEC) that touches on channel coding, reliability, secrecy, key management, cryptanalysis, OPSEC, and data compression.

ETA: Wilkins takes a clear position on the full disclosure debate but cautions about the hazards of experimenting with crypto technologies:

 `...the chiefe experiments are of such nature, that they cannot be frequently practised, without just cause of suspicion, when it is in the Magistrates power to prevent them.'

_Codemonkeyism 9 hours ago 5 replies      
"Polynesian people used binary numbers 600 years ago"

Disgrace to Nature for click bait. The number system might be clever, but as described it is not a binary number system.

Especially the claim in the introduction

"Binary arithmetic, the basis of all virtually digital computation today, is usually said to have been invented at the start of the eighteenth century by the German mathematician Gottfried Leibniz."

implying Leibniz didn't invent binary, and equating the existence of 4 numbers (10, 20, 40, 80), which lack the simplicity of 2 states and are represented with KTPV, with the invention of the binary number system, is sensationalist.

The quote is interesting:

 "It's puzzling that anybody would come up with such a solution, especially on a tiny island with a small population," Bender and Beller say.

It's only puzzling to scientists from "Department of Psychosocial Science" but would not be puzzling to scientists from the "Department of Mathematics".

petersjt014 12 hours ago 2 replies      
Of course, we all know what they would have used if they were real pros:


protomyth 12 hours ago 2 replies      
A couple of Native American tribes used octal because base 8 works for the number of things you can carry between the fingers. Would have made computing a bit simpler if that had been adopted instead of base 10.
1001101 12 hours ago 3 replies      
While not intended to be used mathematically, the I Ching hexagrams developed around 1000AD represented the binary numbers from 0-63 (2^6), as discovered by Leibniz who was quite fond of binary himself.
pfooti 11 hours ago 1 reply      
If this stuff is interesting to you, you should read about Geoff Saxe's work [0] with the Oksapmin people of Papua New Guinea. This book is newish (and I haven't read it, but I've read almost all of the studies the book is built on) and synthesizes a couple of decades of research into how these people count and talk about number and how that practice changed over time (there's some fascinating conclusions about how the base changed from base-27 to base-20 when western money, in particular the 1 pound : 20 shilling ratio, came to be more widely-used).

And yeah, more or less base-27. They used body parts to refer to numbers 1-27, and often anything higher than that was just "a lot", but they could count more using the system if needed. How the Oksapmin people mingled both traditional counting and modern arithmetic is pretty fascinating (and says a lot about the underlying cognitive processes that make humans able to reason about number).

0: http://a.co/6gbRM1x

ajarmst 11 hours ago 0 replies      
While the described system has some powers of 2, which is interesting, it is a far cry from positional binary. It most certainly does not prefigure the system described by Leibniz, nor the application of arithmetical operations to logic, which was the profound insight in question. This shares far more with things like the base-60 fractions used by Mesopotamians than anything recognizably similar to modern binary and logic systems. (Which of course derive a great deal from Aristotle, significantly more than 600 years ago).
erpellan 3 hours ago 0 replies      
This reminds me of the story of ancient Ethiopians using binary maths. I read about it a while back and finally managed to track it down (the original link seems dead): https://web.archive.org/web/20170609082700/http://www.uh.edu...
vorg 10 hours ago 0 replies      
> Mangarevans combined base-10 representation with a binary system. They had number words for 1 to 10, and then for 10 multiplied by several powers of 2

Because base-10 seems to originate from two handfuls of fingers, I wonder why they didn't end up with a base-5 representation with a binary system?

> takau (K) means 10; paua (P) means 20; tataua (T) is 40; and varu (V) stands for 80

By using their numeral for five (say, F) to mean 5 in this scheme, they could have gotten rid of their numerals for 6 to 9. So 157 would be VTPKF2 instead of VTPK7.
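
The K/P/T/V scheme quoted above is easy to mechanize. A small sketch of the decomposition the article describes (covering 1-159, the range these four words plus a single digit word can express):

```python
def mangarevan(n: int) -> str:
    """Decompose n using the number words quoted from the article:
    varu (V) = 80, tataua (T) = 40, paua (P) = 20, takau (K) = 10,
    plus an ordinary digit word for a remainder of 1-9.  Each power
    appears at most once, since 80/40/20/10 are 10 times the binary
    weights 8/4/2/1 -- that is the 'binary' part of the system."""
    assert 1 <= n <= 159
    parts = []
    for word, value in (("V", 80), ("T", 40), ("P", 20), ("K", 10)):
        if n >= value:
            parts.append(word)
            n -= value
    if n:
        parts.append(str(n))
    return "".join(parts)

print(mangarevan(70), mangarevan(57), mangarevan(157))  # TPK TK7 VTPK7
```

This reproduces the article's examples (70 = TPK, 57 = TK7) and vorg's 157 = VTPK7.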

janwillemb 12 hours ago 1 reply      
They had number words for 1 to 10, and then for 10 multiplied by several powers of 2. The word takau (which Bender and Beller denote as K) means 10; paua (P) means 20; tataua (T) is 40; and varu (V) stands for 80. In this notation, for example, 70 is TPK and 57 is TK7.

To my non-mathematically trained ears this doesn't sound like a binary system at all, but more like the highly inefficient Roman system. Am I missing something?

thinkingkong 10 hours ago 0 replies      
If you think this is cool you might also find the Marshall Island stick charts, and Polynesian navigation methods particularly intriguing. See



whatshisface 12 hours ago 12 replies      
Computers use base two because their most natural unit can occupy one of a couple states: 0 or 1. One, two, base two.

Human hands, on the other foot, have ten fingers. Since our favorite mapping between things and integers is finger counting, we naturally end up with more than two states. Zero, one, two, three, four, five, six, seven, eight, nine, and the fully-extended state, ten. That's eleven states, which is why the global human standard is base-eleven.

Wait, what?

userbinator 12 hours ago 0 replies      
They find that the former Mangarevans combined base-10 representation with a binary system. They had number words for 1 to 10, and then for 10 multiplied by several powers of 2.

Interesting. Perhaps this is the very first use of binary-coded-decimal?

jlebrech 6 hours ago 1 reply      
daxfohl 11 hours ago 1 reply      
0b1001011000 years ago? Anybody?
erkose 9 hours ago 1 reply      
Binary numbers have been in use since the balance scale was developed.
aaron-lebo 12 hours ago 2 replies      
The ability of the Polynesians to settle huge stretches of the Pacific might be one of the most impressive human achievements ever.

Odds are they reached South America even if we never find direct evidence (there is some).

Eire_Banshee 2 hours ago 1 reply      
Meh. Ancient Babylonians used a base-60 number system. Does it really matter what number system they used?
What American universities can learn from Germany (2016) washingtonpost.com
68 points by Tomte  4 hours ago   56 comments top 9
igk 3 hours ago 4 replies      
I'm not sure if they are talking about "dualstudium", but I think this is a thing which needs to be advanced. There should be a separation between "skill education" (learning things for a job) and "human education" (learning things to become a better human/just for understanding). The latter we already enforce with 9 years of Schulpflicht (which one could debate about prolonging) and then leave to the individual.

University education should not be or promise jobs, it should be about understanding certain fields on the deep level and being confronted with the bleeding edge of knowledge. Right now we are conflating the two, meaning we have a large number of students wasting their time in classrooms when for their goal they should either be getting deeper, tougher confrontation with the subject (if they want to do research/understand deeply) or practical "on the job" education (if they want to get a job). BWL is the worst culprit of this as far as my friends who studied it describe it.

medymed 3 hours ago 3 replies      
Apart from stratification of students into tracks, America could also learn not to crush its students with debt.

A mandatory cap of tuition for universities receiving federal-debt funded students would be a reasonable way to harness the beast of tuition price inflation created by zealous federal financing of debt and questionable promises of benefits of "college", which seem to have very little to do with training and everything to do with signaling.

Colleges could then decide whether to accept students with federal debt by lowering tuition (and removing bloat) or not. Interested to hear ideas of pitfalls with this approach, and whether they are worse than the current situation of mountains of non-bankruptcy-eligible debt.

sytelus 1 hour ago 3 replies      
TLDR; In Germany, 60% of people get vocational training as their "higher education", designed for very specific jobs. Companies sign contracts with young people at the age of 15-16. They get certification for this specific training by the age of 20 and then get on with those jobs. At least half of the education system is designed as a "factory" that produces the specific skilled workers that industry asks for.

Personally I'm not sure if any country should be adopting this model.

onli 3 hours ago 0 replies      
Should be noted that this is only one segment of German job education. There are of course also classical university courses without much or any contact with real job experience.

Also, what they describe is less the university route than the non-university route: being educated by and in the business. Germany also has universities that cooperate with business, but that's not the norm. The classical division is between universities (Studium) and education on the job (Ausbildung). Not sure how universities could learn from the latter, as it is an alternative to them.

biztos 2 hours ago 0 replies      
Probably o/t for the article:

My own experience is very dated, but in the late 80's and early 90's I attended public university in both countries.

Back then it really struck me how infantilized the student experience in the US was; but also how much more we actually worked, because a little arbitrary structure is pretty helpful for manic 19-year-olds.

For the self-motivated, academically serious student it was a kind of paradise with slightly shabby institutions, some of them very very old. Most professors, even the famous ones, were accessible, but in lieu of "homework" you were expected to keep up on the reading, for which at best you got a list of books (at worst: go figure it out). Nobody seemed to care at all about grades.

Even then, when public universities in the US were cheap, it was remarkable how much cheaper studying in Germany was, from the minimal fees to the ubiquitous subsidies. Student housing in California was a racket - the dorm cost way more than a shared apartment - but in Germany it was so laughably cheap you never really minded the bad parts. We even had a semi-official student bar with 1DM (50c) beers.

I think partly because of the freedom, partly because of the low cost, and partly because working-class kids mostly didn't go to university, there was basically no attention paid to the need to one day make a living. I didn't stay long enough to find out if that was a good or a bad thing; I did my best to avoid that question in the US anyway, and paid the usual price for it.

geff82 1 hour ago 1 reply      
The article is completely "fake news" as it is not about Universities in Germany. It is about apprenticeship. Apprenticeship is a widely accepted path to professional education here and works as described in the article. Your salary is also not as high as mentioned - normally around 500-800 Euros/month.Besides universities and apprenticeship, there are "dual studies" where you are partly working for (usually big) enterprises and partly studying for a university degree. But only a few students are doing this, most either do apprenticeship (which includes interesting professions like programmer or system administrator) or university studies.
merb 3 hours ago 4 replies      
Please don't. I live in Germany and our academic system is flawed to the extreme. Even the dual system.

I mean, come on, our universities and schools have a worse system than America (and a lot of other countries). The dual system should've bridged school and labor skills, but it does not work that well, especially not in IT. The problem: a lot of kids come from different schools, some with higher educations and some without, so the school tries to teach a lot of things which should've been taught in school already, while not training you in many practical things. And a lot of the practical things you do learn in the school you have mostly already learned, since the school progresses way too slowly.

As already said I think the problem starts already in the kindergarten/early school. And then progresses in middle schools.

In Germany you basically get thrown into life either after your school years or after your university/higher schools. You always start with nearly no experience; even after you did an apprenticeship in the dual system you are not prepared for your life.

Worse, even if you did something in West Germany it could be slightly different in other parts of Germany, so switching schools can be really, really hard.

Our system basically enforces "laxness" until a certain point in life, and then you either get it or you get lost in the system. And it gets worse due to our politics, which loosen more and more rules.

rayiner 3 hours ago 0 replies      
Pretty rich to use Drexel and Northeastern, two private colleges with a cost of attendance approaching $70,000 per year, as any sort of model.
Zahlmeister 1 hour ago 2 replies      
This article is total bullshit and so is the "dual education" model.

"Duale Ausbildung" has nothing to do with universities, if you go that route you spend up to four years working at (on average) well below 1000 while attending a government school that teaches at a rather low level. The government has to subsidize the income of those people and the employer, in some cases.


Worse, if you don't buy into the "dual education" thing, you can't get most of these jobs. We're not talking about great jobs either, it's most jobs that don't require university education. A lot of that is stuff you could be trained for on-the-job in six months or less.

Don't trust German unemployment statistics, it's basically the government paying at every end to have people employed at all costs and often at very low pay.

Also, don't buy into the German educational model, it's one of the most discriminatory systems in the world, even though it is "free" on paper, it's all about weeding out people at a young age so they're not allowed to attend the "free" university. We have parents suing the teachers of ten-year-olds for the grades they give, because of the impact it can have on their careers.



Examples to get started with Deep Learning without learning any of the math github.com
74 points by pplonski86  5 hours ago   8 comments top 4
shoshin23 3 hours ago 1 reply      
I would say that this title is misleading. A lot of what is presented there needs a strong grasp of deep learning (and the other underlying concepts behind it), without which all you'll do is load the examples in Xcode and run them.

Moreover, I would probably encourage people to read examples of Tensorflow or Caffe2 running on iOS rather than something like Forge. Forge is an interesting project but won't really help you if you don't have a clue about MPS or Deep Learning.

amelius 2 hours ago 0 replies      
So sad to see the divide between iOS and Android platforms.

Half of this stuff I can't run.

reader5000 2 hours ago 2 replies      
There isn't really any math to deep learning other than the concept of a derivative, which is taught in high school calculus. The reason deep learning papers seem mathy is that people take network architectures and various elementary operations on them and try to express them symbolically in LaTeX using summations and indexing hell. For example, the easy concept of "updating all the neurons in one layer based on the neurons in the previous layer and connecting weights" is expressed as matrix-vector multiplication for no apparent reason other than that it is technically correct, makes for slicker notation, and I guess makes it easier to use APIs that compute gradients for you.

Deep learning, however, is broadly an experimental science, which in many ways is the opposite of math as traditionally envisioned, in which great insights follow deductively from prior great insights. If you ask a basic question like "why should I use 4 layers instead of 3?" there is no answer other than "4 works better". Similarly with gradient descent versus random search in weight space. There are many problem domains where random search is as good as any known hill-climbing heuristic search (like gradient descent). Why is GD so effective when learning image classifiers expressed as stacked weight sums? Who knows.
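
For what it's worth, the "plain" description and the matrix-vector form really are the same computation; a minimal sketch with made-up weights:

```python
def layer_update(weights, prev_layer):
    # Each output neuron is the weighted sum of every neuron in the
    # previous layer -- written out as loops rather than notation.
    return [sum(w * x for w, x in zip(row, prev_layer)) for row in weights]

# 2 output neurons fed by 3 input neurons (arbitrary example values)
W = [[1.0, 0.0, 2.0],
     [0.5, 1.0, 0.0]]
x = [3.0, 4.0, 5.0]

print(layer_update(W, x))  # [13.0, 5.5] -- identical to the matrix product W.x
```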
rrggrr 1 hour ago 0 replies      
Gosh I wish this existed for Python.
Switching to the Mutt Email Client nullprogram.com
200 points by ingve  9 hours ago   167 comments top 24
torrent-of-ions 7 hours ago 6 replies      
I started using emacs for email back in my uni days. I used Dovecot to run an IMAP server locally which would get my mail when it could. This made it very quick in emacs because it wouldn't have to wait for any remote server.

But, unfortunately, I had to go back to Thunderbird. With the exception of free software mailing lists, people simply do not know how to use email. Bottom quoting is the default. We can thank Microsoft for that one, I think. I can't understand why anyone thought it was a good idea to attach the previous message to the bottom. Most people just accept this as the way email is and never question how unbelievably stupid it is.

Then you've got all the HTML shit that people put in there. And the fact that many clients are simply broken and don't write the headers correctly which breaks threading. This is often confounded by broken servers which garble the headers a second time.

It's amazing that something as simple as mutt or gnus (emacs) has everything you need: threaded messaging and convenient composition. Inline quoting is surely the only sane way to do quoting. Why would you assume the person you reply to has deleted their own message? My message to you is right there in the same thread in my client. Don't send it back to me. But do make it clear which parts of it you are addressing (inline quoting). You had to laugh when Google reinvented threaded messaging in the 2000s and advertised it as some groundbreaking new thing.

So I gave up. I use a client that can deal with the stupid shit that other clients spit out. I also bottom quote now. Why? Because if you do inline quoting, MS Outhouse web client thinks that everything after the beginning of the quotation is a copy of the previous message and folds the whole thing, meaning your recipients get back what looks like an empty message. Really.

preek 8 hours ago 5 replies      
Interestingly, I went the other way. I used to have mutt for years, then went through a dark phase (macOS Mail.app, Evernote and Things) and finally concluded to Emacs with mu4e.

Integrating mail with Emacs was the best idea ever, because Emails include either todos or serve as reference in projects. Creating todos from Mails as well as linking to them from org-mode is only a shortcut away.

I have never had a more easy to manage and productive todo management system - and it's all text, so there will never be breaking changes or paid upgrades or incompatibilities between tools that I cannot fix.

pyre 43 minutes ago 0 replies      
When I was using mutt, one of the biggest things that I found missing was a way to apply a filter to the entire email without creating some sort of SMTP gateway. I wanted the ability to compose my emails in Markdown, and run the text/plain version through a filter to generate a text/html version. At the time (though I'm sure this hasn't changed), you could only run various parts of the email (e.g. only the text/plain part) through filters, but not the whole email.
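
A sendmail wrapper is one way around mutt's part-by-part filters: intercept the whole outgoing message and rebuild it as multipart/alternative. A sketch using Python's stdlib email package; `render` is a placeholder for a real Markdown-to-HTML renderer, and the sample headers are made up:

```python
from email import message_from_string
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def add_html_alternative(raw: str, render) -> str:
    """Take a raw text/plain message and return it rebuilt as
    multipart/alternative, with an HTML part produced by render()."""
    src = message_from_string(raw)
    body = src.get_payload()
    alt = MIMEMultipart("alternative")
    for key in ("From", "To", "Subject"):   # carry over the basic headers
        if src[key]:
            alt[key] = src[key]
    alt.attach(MIMEText(body, "plain"))
    alt.attach(MIMEText(render(body), "html"))
    return alt.as_string()

out = add_html_alternative("From: a@example.com\nSubject: hi\n\n*hello*\n",
                           lambda text: "<em>hello</em>")  # toy renderer
```

A real wrapper would read the message from stdin, apply this, and hand the result to the actual sendmail binary.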
oblio 7 hours ago 6 replies      
I feel that the folks that do this aren't working for big companies.

In most big companies you can't really afford to do any email client shenanigans. You probably need to see HTML emails, handle calendar invites, etc.

szastupov 4 hours ago 10 replies      
I spent quite some time setting up mutt or trying to make a real IDE out of vim or Emacs, but in the end it's just cargo cult and a waste of time. There are modern email clients, IDEs, polished operating systems, etc., all built in this century. There is absolutely no need to waste time scoring nerd points.
MikeTaylor 7 hours ago 13 replies      
This is all very nice, but the fact that the article doesn't even _mention_ images or attachments shows how divorced it is from the reality of most people's email experience.

I happily used emacs as my mail-reader from 1989 to about 2009 (first RMAIL mode, then VM). The increasing importance of non-text formats made it cumbersome, and I somewhat reluctantly switched to GMail. It's hard to imagine switching back now.

fusiongyro 18 minutes ago 0 replies      
I have been using mu4e for three years or so and not had any bugs or glitches to speak of. I haven't experienced any stability issues with package upgrades either.
HurrdurrHodor 5 hours ago 1 reply      
"fiddling feels a lot more productive than it actually is"

Best thing about this article. Unfortunately he didn't derive anything from this wise statement.

stevekemp 8 hours ago 3 replies      
Console based mail-clients are very powerful and fast to use. I was a long-time mutt user before I started writing my own mail client, with complete scripting provided by Lua.

Having a real scripting language allows a lot of useful features, at times using mutt felt a little bit too constrained, although it has a lot of features and is very very flexible in its own right:


jk2323 6 hours ago 1 reply      
I tried Mutt a long time ago but did not like it (I may not even have gotten it to work properly).

I always like Pine/Alpine. I only moved to Thunderbird, which is buggy and a resource hog, after I had to handle a dozen email accounts.

Question: How are many POP/IMAP accounts currently handled under Pine/Alpine? Possible?

pmontra 8 hours ago 2 replies      
I used some good emacs mail client up to 1995, then the need to read Office attachments made me switch to some Windows client. HTML messages were not a thing yet. The switch was a regression in functionality for at least ten years, up to outbound filters in Thunderbird.

A couple of questions for the author: how do you handle attachments, and do you download mail over POP3/IMAP or do you have your own server (hence SpamAssassin)? Edit: third question, how do you read mail on your phone?

k2enemy 4 hours ago 2 replies      
If you like mutt, it is worth checking out neomutt (https://www.neomutt.org). It is mutt with some of the popular patches already compiled in.
discreditable 2 hours ago 0 replies      
Format=flowed is great. I just wish more clients supported it for sending and receiving. I've felt lately that I would even consider using the Windows Mail app if it supported plaintext and f=f.
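
For the curious: format=flowed (RFC 3676) marks a "soft" line break with a trailing space, so receiving clients can rewrap paragraphs to any width. A deliberately simplified sketch of the receiving side (it ignores space-stuffing, quote prefixes, and the "-- " signature separator from the full spec):

```python
def unflow(text: str) -> str:
    # A line ending in a space is a soft break: it flows into the
    # next line.  A line without one is a hard break and ends the
    # paragraph.
    paragraphs, current = [], ""
    for line in text.split("\n"):
        if line.endswith(" "):
            current += line
        else:
            paragraphs.append(current + line)
            current = ""
    if current:
        paragraphs.append(current)
    return "\n".join(paragraphs)

print(unflow("Hello \nworld.\nNext line."))  # joins to: Hello world. / Next line.
```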
juanre 7 hours ago 0 replies      
I switched back to mutt 5 months ago, when my startup started to grow fast and I needed a sensible way to deal with lots of email in multiple accounts. I haven't looked back. I still have gmail as a back-end, but I use mbsync to keep local copies of all my accounts, and msmtp to send mail.

Some resources I found helpful:

- http://stevelosh.com/blog/2012/10/the-homely-mutt/ (good, but it recommends offlineimap; I tried it and found it to be slow and buggy. mbsync is much better.)

- https://github.com/cbracken/mutt

- https://lukespear.co.uk/mutt-multiple-accounts-mbsync-notmuc...

angry_octet 3 hours ago 0 replies      
I love mutt and tin (Network News, google it youngsters), and I persisted in sending plain text email for many years, even signed it with pgp. I snickered many times over the years as people got hacked by their own email clients. I sent back 'Meeting Invite' emails to people who couldn't draft a proper meeting request using words.

But ultimately when gmail came along I realised there was a better way. It is only a problem that we give up our privacy and agency by using Google's software and computers. And nobody was checking my PGP signatures anyway.

If you feel like you want to get the benefits of a nice email GUI, in a sandboxed browser, but also want privacy and some control, check out Mailpile:


It isn't finished, but it is promising.

TurboHaskal 7 hours ago 1 reply      
Ugh, no thanks.

I already use enough arcane tools on a daily basis due to my work. For mundane stuff like reading and sending mail, I'd rather take a break and use tools that even my tech illiterate father could effectively use.

It's really detoxifying to become a normie from time to time.

Disclaimer: Used to read mail with emacs back in the day. Tried mutt for a while.

roylez 4 hours ago 0 replies      
I used mutt for 10 years and then moved back to web mail. With inbox zero and gmail keyboard shortcuts, processing email is so easy that mutt is almost overkill.
wheresvic1 4 hours ago 0 replies      
I switched to using mutt with Gmail and documented my story here: https://smalldata.tech/blog/2016/09/10/gmail-with-mutt
HugoDaniel 7 hours ago 2 replies      
Very good. I could easily complement this with a calendar terminal app. Does anyone know of a terminal calendar app to keep appointments? (similar to the Apple calendar, or Google calendar)
pjc50 7 hours ago 0 replies      
I switched to mutt .. about a dozen years ago, from PINE. I still use it with a pine keybindings file, and I still miss the better addressbook support of pine.

Mind you, that's on a secondary email account; if I need to handle images or attachments I forward it to gmail.

Karunamon 46 minutes ago 0 replies      
One of Mutt's best use cases, in my opinion, is rapid processing of lots of email.

Every now and then I'll go on a spree to zero out my inbox, and I find doing this with mutt's search/tag/move/delete workflow is a lot faster than messing around with my mail provider's application (Fastmail's, which is awesome, and Gmail's, which is less so) or any other client I'm aware of.

erikb 6 hours ago 0 replies      
Headline: Huuuuugely important topic, give me content!

Opening the text: That looks like 5% of what people need to get started. If you could summarize it to such a degree, everybody would use mutt.

snakeanus 7 hours ago 0 replies      
I switched to mutt after I was unable to view images with emacs (gnus), which did not bother (or failed) to un-base64 them, either when trying to display them inline or when saving the image.
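
For anyone unfamiliar with what "un-base64ing" an attachment involves, Python's stdlib email parser can sketch the step a mail client has to perform before displaying or saving an image. The message below is a made-up minimal example (not from gnus or mutt); `get_payload(decode=True)` is what applies the Content-Transfer-Encoding:

```python
import base64
import email
from email import policy

# A made-up minimal message whose body is a base64-encoded image.
raw = (
    "MIME-Version: 1.0\r\n"
    "Content-Type: image/png\r\n"
    "Content-Transfer-Encoding: base64\r\n"
    "\r\n"
    + base64.b64encode(b"\x89PNG\r\n\x1a\n").decode("ascii")
)

msg = email.message_from_string(raw, policy=policy.default)

# decode=True applies the transfer encoding, i.e. un-base64s the
# payload before you display or save it.
data = msg.get_payload(decode=True)
print(data[:4])  # b'\x89PNG'
```

A client that skips this decode step would hand you the base64 text itself, which is exactly the failure described above.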
thinkMOAR 6 hours ago 0 replies      
I can't (by now couldn't) imagine running mutt on my mobile to read email.

So I just did, to test it, and it is horrible! :)

Don't Leave Coredumps on Web Servers hboeck.de
68 points by hannob  11 hours ago   18 comments top 6
sbahra 15 minutes ago 0 replies      
Or use backtrace.io to deal with it. It'll transfer the core dump or a derivative to a centralized server.

Transparency: co-founder.

Xoros 4 hours ago 4 replies      
"PHP used to crash relatively often." ?

In 15 years of hosting hundreds of PHP sites, I've never seen a single coredump. Is it just me?

cpantoga 48 minutes ago 1 reply      
I just want to point out that the article says you should disable coredumps by putting "* soft core 0" in your limits.conf file. You should actually put "* hard core 0" there. Limits have a soft and a hard value: the soft value is a user-configurable maximum, so an application whose code changes its own ulimit can still create coredumps. The hard value is the system ceiling; if you set it to 0, no unprivileged user can raise it again. You can test this yourself by typing "ulimit -c unlimited" in your terminal: with only the soft limit set to 0 it succeeds, re-enabling coredumps, whereas with the hard limit set to 0 it is refused.
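
The soft/hard distinction is easy to verify from inside a process with Python's `resource` module (a sketch for Linux/macOS; it only touches this process's own limits):

```python
import resource

# Current core-dump limits of this process.
soft, hard = resource.getrlimit(resource.RLIMIT_CORE)

# A process may move its soft limit anywhere up to the hard limit,
# so a soft limit of 0 is only a default, not an enforced policy.
resource.setrlimit(resource.RLIMIT_CORE, (hard, hard))

# Lowering the hard limit is always allowed ...
resource.setrlimit(resource.RLIMIT_CORE, (0, 0))

# ... but once it is 0, asking for a soft limit above it fails,
# which is what "* hard core 0" enforces system-wide.
try:
    resource.setrlimit(resource.RLIMIT_CORE, (1, 0))
except ValueError:
    print("core dumps stay disabled")
```

The same distinction explains the `ulimit -c` test: the shell builtin adjusts the soft value, and only the hard value acts as a ceiling it cannot cross.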
Terr_ 7 hours ago 0 replies      
If you want to test your coredump configuration for webpages (as opposed to the command-line interpreter) here's a little suicidal script for triggering it on demand:


MatthewWilkes 3 hours ago 0 replies      
I'm a little surprised at the abuse contact step. Is that really more effective than emailing security@ and webmaster@ for the affected domains?
MichaelBurge 8 hours ago 3 replies      
You can "Control + \" your application to generate a core dump if you want to test it.
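
For context, Ctrl+\ makes the terminal send SIGQUIT, whose default action is to terminate the process and (limits permitting) dump core. A small Python sketch confirming the signal-death part; it assumes a POSIX system, and whether a core file actually appears depends on RLIMIT_CORE and the core_pattern setting:

```python
import os
import signal

# Fork a child that just waits; deliver SIGQUIT to it, the same
# signal a terminal sends for Ctrl+\.
pid = os.fork()
if pid == 0:
    signal.signal(signal.SIGQUIT, signal.SIG_DFL)  # ensure default action
    signal.pause()   # child: block until a signal arrives
    os._exit(0)      # not reached: SIGQUIT's default action kills us

os.kill(pid, signal.SIGQUIT)
_, status = os.waitpid(pid, 0)

# The kernel reports a death-by-signal exit status.
print(os.WIFSIGNALED(status), os.WTERMSIG(status) == signal.SIGQUIT)
```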
The Use of Subroutines in Programmes (1951) [pdf] laputan.org
7 points by adgasf  2 hours ago   1 comment top
Upvoter33 17 minutes ago 0 replies      
thanks for posting this. it is fun to see historical discussion of such fundamental concepts... Dijkstra's notes also have this flavor.
Chinese scientists test quantum entanglement over unprecedented distance scientificamerican.com
109 points by javascript-this  12 hours ago   54 comments top 9
sliken 9 hours ago 5 replies      
A staple of sci-fi has been a pair of boxes with quantum entangled particles inside allowing FTL communications once the boxes travel (sub-FTL) to their destination.

By my understanding that's impossible, although I always wondered if you could signal FTL by detecting when a particle was no longer entangled.

Can someone with the background comment?

ozy 1 hour ago 0 replies      
Note that while particles can be entangled, the special/useful thing is the postponement of further entanglement (with the rest of the world).
Illniyar 9 hours ago 5 replies      
I thought quantum entanglement is the idea that two quantum particles separated by distance will behave the same at the same time, regardless of distance.

Why is sending photons over distances a verification of quantum entanglement?

Can someone elaborate on what this is?

kiliankoe 9 hours ago 3 replies      
As a child I dreamt of having two futuristic walkie-talkies sharing several pairs of entangled particles making perfect instant communication across the galaxy possible. Later my physics teacher crushed my dreams by telling me it's far from as easy as that and probably impossible due to Heisenberg's ... (no clue what it's called in English). It's pretty awesome to hear that something similar is actively being worked on and seems possible, although being a huge challenge.
pulse7 9 hours ago 2 replies      
The interesting part of "quantum communications" is the impact on security and surveillance. Does "quantum communications" even need switches and routers? Backdoors don't seem to be possible...
fsiefken 6 hours ago 3 replies      
Could you use this for untraceable communication with hidden non-traceable submerged subs? That would be a military advantage.
chibaozi 3 hours ago 0 replies      
we can use quantum tech to break the GFW firewall
EGreg 9 hours ago 2 replies      
Wikipedia says:

"If Bohmian Mechanics as a non-local hidden variables interpretation of quantum mechanics is accurate, it should allow quantum computers to implement a search of an N-item database in at most O(N^(1/3)) steps. This is slightly faster than the O(N^(1/2)) steps taken by Grover's algorithm using standard interpretations of quantum mechanics. Neither search method will allow quantum computers to solve NP-Complete problems in polynomial time.[4]"

I thought there was no way to figure out which interpretation of QM is correct. Can someone explain?
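
Setting the interpretation question aside, the two query counts in the quoted passage are easy to compare numerically; this is plain arithmetic, not a claim about either interpretation:

```python
# Query counts for searching an N-item database, per the quote:
# O(sqrt(N)) for Grover vs O(N**(1/3)) under the Bohmian speculation.
N = 10**12
grover_steps = round(N ** 0.5)
bohmian_steps = round(N ** (1 / 3))
print(grover_steps)   # 1000000
print(bohmian_steps)  # 10000
```

So for a trillion-item database the speculative bound would save a factor of about 100 in queries, while still leaving the search super-polynomial in the input size.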

splicer 9 hours ago 1 reply      
Imagine being able to explore distant planets with robots and VR goggles in real time!
Hard Questions fb.com
81 points by panic  11 hours ago   29 comments top 11
ddellacosta 2 hours ago 1 reply      
No mention of what Facebook's business is anywhere on the page.

Here are some actual hard questions that would probably get you fired from the position of "Vice President for Public Policy and Communications" were you to post them to the official FB blog:

- can a company which makes money as an advertising platform and data broker really protect the privacy of its users when that is directly at odds with its fundamental mission?

- is there value for Facebook in figuring out how to remove fake news or content from terrorists or dead users from its platform vis-a-vis continuing on with business-as-usual?

- is there actually any room for someone within the organization at Facebook to ask questions like this without violating implicit or explicit company norms about how Facebook is discussed, internally or publicly?

opportune 3 hours ago 2 replies      
I expect that the answers to these questions will be "whatever makes facebook the most money without drawing too much public distrust."

This is the company that performed psychological experiments on their customers without informed consent. The same company that worked with the government of Pakistan to EXECUTE a citizen spreading 'blasphemy' on their platform.

Zuck seems to want to start getting into politics (kill me now) so probably the best way to combat facebook's continuous moral disasters is to hold him personally responsible for them.

igk 7 hours ago 3 replies      
Interesting, though I must admit I am skeptical of the earnestness here (PR...). I am at work so I couldn't flesh it out properly, but I sent a cobbled-together list of topics that I would love an official consideration of (hell, if they want, I would love to work with them on some of this):

1. Digital self-determination. Having the right to not have pictures of you made public (where is a setting that automatically UN-tags me and blurs my face in pictures I don't approve?) as a private person vs. having a right to talk about public individuals, companies, filtering etc. without impediment

2. Following on that, collaboration with law enforcement instead of deleting/censoring.

3. Retroactive denial of usage. This applies more in the EU, but how does facebook plan to address the idea/need to revoke usage of my personal data after it has been used/possibly sold (IIRC you don't sell data directly, but that might have changed since then)?

4. Data takeout: what about making it easy to download and archive all posted content, but also uploaded content related to the individual (tagged pictures, mentions)

5. The status of facebook as a transnational, as "infrastructure", and its relationship to the democratic and social systems in different countries

6. The ethics of curating and counter-curating topics in the feed/related content for power users

7. The role of the defacto largest dataset on human communications and the use of that for AI research as well as the impact of AI and automation on society. Combined with that the position facebook would take on possible solutions like UBI

8. The impact of the "highlight reel" on depression and mental health. Studies have shown that heavy facebook use correlates with depression

humanrebar 2 hours ago 0 replies      
> Facebook is where people... form support groups...

Geez. I hope not.

This brings up a "hard question" we need to consider: discretion.

Many categories of communication are modeled really well by modern technology, including social networks. However, there is a real dearth of discretion-oriented communication available.

Privacy concerns are obvious and being discussed, but we're not really discussing discretion as a healthy part of our lifestyles. Some things need to be shared, but only with particular people or groups. This is absolutely social communication but it's basically unaddressed by any social network I can think of. Group discussions that revolve around substance abuse, significant health issues, survivors of violent crimes, etc. Even less structured conversations with loved ones that are about sensitive issues: health problems, money problems, relationship problems, etc.

And it's worth mentioning that these goals seem to be at odds with the goals of our governments (surveillance) and our social networks (data collection and mining).

Whatsapp (closed source encryption aside) helps a bit for one-on-one communication or permanent groups (like families), but doesn't lend itself to the conversations you might have with a counselor, priest, spouse, or support group.

It may be that "social networks" are entirely for public discourse, but they don't seem to be modeled that way, with "friends", "private" messaging, and so on. In the meantime, we have no real medium for this type of communication and I suspect it's actively hurting our culture and society.

rajathagasthya 6 hours ago 0 replies      
> How can we use data for everyone's benefit, without undermining people's trust?

How about not collecting so much data? I really don't see how my social data benefits anyone except Facebook and advertisers.

blfr 1 hour ago 0 replies      
My guess at the answer is that effectively the public debate will be shaped as a compromise between ad peddlers, spooks, and people who have nothing better to do than complain on Facebook. I'd rather have terrorists in my news feed.
pzone 7 hours ago 0 replies      
Honestly this seems like the best we could hope for, not just from Facebook, but from any company. It seems honest and forthcoming. The conversations we're starting will be going for decades. This page clearly recognizes this.

Anyone who thinks they have the one true answer to any of those bullet points is full of hubris.

idbehold 2 hours ago 0 replies      
The orphaned words in that bulleted list are driving me crazy. Just throw an `&nbsp;` between the last two words in each line and this will never happen.
deepnet 5 hours ago 2 replies      
Facebook is unprecedentedly widely used, global in reach.

It has become for many an essential service.

Facebook ( in many countries ) is a monopoly, unassailable competitively due to the size of its network.

In days of yore, such a huge monopoly that is a utility would be broken up like Ma Bell in the 1980s or like rail, water and electricity in the UK were nationalised.

Consumer data protections are currently woefully insufficient.

I propose that such vast walled gardens should be curtailed for the public good.

Thus Facebook should be just as viewable without signup, data exportable and deletable in whole or in part.

Facebook should provide an API so competing and extending services can be freely built on it, restoring competition and innovation and niche utilities it brings.

Consumers should be free to use whatever services and programs they choose on their own data whether stored on Facebooks servers or downloaded and held privately.

Huge transnational data monopolies should not be walled gardens.

john_doe_55 7 hours ago 2 replies      
[off topic] I'd really love to have one of those "Save it for later" or "mark it important" buttons on SNs; I tend to keep wasting my time digging through endless posts and/or feeds to find what I briefly saw.
Things that Idris improves things over Haskell deque.blog
143 points by deque-blog  11 hours ago   88 comments top 17
unwind 7 hours ago 0 replies      
Admins: please consider editing the title; it has a redundant "things" that makes it hard to read and confusing.

The blog post's real title ("10 things Idris improved over Haskell") is better; unless that's a problem due to being a list (which are often spammy). Seems fine/serious to me, though.

mej10 3 hours ago 5 replies      
Idris points the way to the future of programming.

A few examples that I find very intriguing:

1. type safe printf: https://github.com/mukeshtiwari/Idris/blob/master/Printf.idr

2. compile-time evidence that a runtime check will be performed: https://github.com/idris-lang/Idris-dev/blob/master/libs/bas...

3. Most of all, compile-time checking of state machine properties http://docs.idris-lang.org/en/latest/st/introduction.html

Think about being able to specify protocols in the type system and ensuring that your client and server meet the specifications. The example in that link is about user authentication. Imagine having a proof that the program can only get into a "LoggedIn" state by going through the authentication protocol.

zimbatm 4 hours ago 2 replies      
I am not a fan of `cast` (also seen in other languages) as it leads to reductionist thinking.

There is usually more than one way to convert from one type to another. How is Float to Int rounded? Shouldn't String to Int return conversion errors? Just looking at `cast` means I have to learn what the language decided to default to.

insulanian 3 hours ago 2 replies      
How does Idris compare to Agda [1] and F* [2]?

[1] http://wiki.portal.chalmers.se/agda

[2] https://fstar-lang.org/

harveywi 1 hour ago 0 replies      
Frustrated Scala users take note: Idris can compile to JVM bytecode (https://github.com/mmhelloworld/idris-jvm), JavaScript (displacing Scala.js), and native code (displacing Scala Native). Idris may be a very good choice for a post-Scala language.

One thing that I find strange, though, is that some of the most prolific Scala developers who are critical of the language seem to stick to languages such as Haskell/Eta and PureScript. Maybe it's the immaturity of the Idris ecosystem.

logophobia 3 hours ago 0 replies      
So, the unpack/pack solution seems a bit weird to me. Why do you need to convert a string to a list just to iterate? I'm assuming "List" is a linked list.

Why not have an Iterable/Enumerable typeclass/trait/interface which is implemented by each data type that is list-like? Seems a lot more efficient and easier to understand than having to convert between representations just to iterate or change elements.

pvdebbe 8 hours ago 2 replies      
I find it unfortunate how many languages treat strings as lists of chars for no clear benefit. This issue is apparent in dynamically typed languages where a function might expect either a string or a sequence, and now it all has to come down to type comparisons because both can be iterated. I think it'd be time to start treating them as atomic values instead.
rnhmjoj 5 hours ago 4 replies      
The problem with working with strings in Haskell is that there are too many datatypes: Data.Text, Data.Text.Lazy, Data.ByteString, Data.ByteString.Char8, Data.ByteString.Lazy, Data.ByteString.Lazy.Char8. All of them share the same function names, so you have to do imports like

  import qualified Data.ByteString as BS
  import qualified Data.Text.Lazy as TL
  import qualified Data.Text as TS

and somehow the library you need always uses a different ByteString variant from the one you already chose, so you have to pack/unpack here and there. There should be a way to make `length`, `map` and all work on every string type. Maybe a type class or some better idea is needed.

By the way the link to the caesar cipher is broken.

danidiaz 8 hours ago 1 reply      
An interview with the creator of Idris in the Code Podcast: https://soundcloud.com/podcastcode/edwin-brady-on-dependent-...

Another cool feature of Idris is elaborator reflection https://www.youtube.com/watch?v=pqFgYCdiYz4 which I believe has no direct Haskell analogue (template Haskell perhaps?)

coldtea 7 hours ago 1 reply      
Wait, Idris sounds like a much improved Haskell.

Any downsides (in the core language) besides the smaller community?

Any chances for Haskell to get some of the same things?

grabcocque 6 hours ago 1 reply      
Idris is one of those languages that when I learned it felt like it was offering me a glimpse into a possible future.

I love it when that happens.

nv-vn 2 hours ago 0 replies      
Idris is a really great language and I recommend that anyone struggling with Haskell give it a try. As an OCaml programmer struggling to adjust to Haskellisms, I (ironically) ended up learning Idris before Haskell. Some of the features -- strict evaluation by default, IO evaluation vs. execution, more alternatives to do notation, better records, effects instead of monad transformers -- make Idris vastly easier to understand as a beginner despite the use of dependent types. As a language, Idris is still lacking a couple of things I'd like (and there's still plenty of bugs), but it definitely feels like it's a much refined version of Haskell.
alkonaut 4 hours ago 2 replies      
I know Idris (apart from improving some Haskell warts) also adds dependent types. Are there any good simple examples of dependent-type use that are not vectors-of-length-N?
MichaelBurge 8 hours ago 4 replies      
Many of these you can work around with language extensions or a custom Prelude, but then you need to have been bitten by them to know that you need to work around them.

I hear "dependent types" and I think "theorem prover", but it seems like Idris is a cleaned up Haskell with some light proving features built in?

Haskell is good for compilers and parsers. What's a good excuse to try Idris?

mijoharas 6 hours ago 1 reply      
Can anyone explain the last point to me? How exactly is the example considered "abuse"? What would an attempt to write something similar in Haskell look like?
alphonse23 6 hours ago 0 replies      
Was this written in a hurry? There are a lot of spelling/grammar errors in the article.
Kiro 5 hours ago 0 replies      
What is the problem with strings being lists in Haskell?
Memristor: The missing circuit element (1971) [pdf] cpmt.org
68 points by dayve  15 hours ago   16 comments top 5
smalley 13 hours ago 1 reply      
Oh man, if you like that paper, do I have the list for you. We used this list for literature review while doing research on HP's memristors: http://webee.technion.ac.il/people/skva/new_full_reference_l.... It covers a pretty wide range of topics related to memristors.
danmaz74 6 hours ago 1 reply      
Anybody know what happened to HP's "memristor"? Was it only vaporware?
throw_away 9 hours ago 0 replies      
Fun fact: Leon Chua is Amy Chua's (of Tiger Mom fame) father.
exabrial 10 hours ago 1 reply      
Ok, I get that capacitors and inductors are opposites, but I've yet to understand the significance of why the memristor is the opposite of a resistor... Chalk it up to my primitive knowledge of analog design.

Couldn't there theoretically be a memcapacitor and a meminductor as well?

Black-Plaid 9 hours ago 0 replies      
Here's a nice talk about memristors:


Open Source Datasets deepmind.com
325 points by amberj  19 hours ago   21 comments top 7
Alexqw85 43 minutes ago 0 replies      
The lab I work in has published, and continues to extend, the studyforrest dataset for quite a few years now.


Most of the consumers so far have been neuroscience researchers and statisticians, but we do hope (and think) that there's value for a wide variety of interests.

There's a bunch of different data, but the highlights are fMRI scans of people watching and/or listening to the movie Forrest Gump, eye tracking, and detailed annotations of the movie. We are also about to begin acquiring simultaneous EEG and fMRI.


Accessing the data is easy, and, as great admirers of Joey Hess, we also have it available in a git annex repo. :-)



[EDIT] Given that this thread is about open source datasets, it's probably worth mentioning that the license is PDDL.


zitterbewegung 15 hours ago 2 replies      
I have been thinking about what this solves compared to other datasets. Nearly all shape-recognition datasets have a restriction that you can't use them unless you are an academic. I feel that open-sourcing datasets will allow us to be more democratic with data and the things that are generated from them. Creative Commons seems like a good license for this, though. Having the data is half the battle. The rest is to make open models (Google is good at this); then you could take pretrained models and not have your data leave your house. I hope and dream we can do this.
Kpourdeilami 17 hours ago 4 replies      
Somewhat unrelated: Deepmind's website is so cluttered and distracting to the extent that it is almost unusable
ptero 3 hours ago 1 reply      
One question that is not clear to me is what a dataset license should allow/restrict, in a perfect world. For me (just a personal opinion) it would allow free (as in liberty) use, but somehow encourage those who use it to share the benefits (data, software or algorithms) under the same license.

Unfortunately, Open Source does not help here -- I do not see how OS can be used with data sets. The main OS leverage with software development is that if you use software X to build software Y, X is usually present in some way, shape or form in your deliverable Y. Not so with training data -- once algorithm development is done you can (and usually do) strip training data out and have a finished product that does not require X to run.

Even if one were to require open sourcing derived datasets it is usually easy to segregate the dataset with a tainted (open source) license as you build up your data so the new datasets are not formally "derived" and thus would not need open sourcing.

I would love a better way forward on this, or at least a cleaner explanation of options.

iandev 13 hours ago 1 reply      
Forgive my ignorance, but I'm not sure for what a dataset like the "Collectible Card Game to Code"[0] might be used. Can anyone explain how and for what it might be used?

[0] https://github.com/deepmind/card2code

deepnet 5 hours ago 0 replies      
Dear Deepmind, as you have retired AlphaGo please open source the dataset of Go games used to train it.
blazespin 12 hours ago 0 replies      
Cool. When DeepMind originally joined Google, it was on the condition that Google would be moral about its use of AI.
A Startup Making Paper Out of Stone, Not Trees bloomberg.com
243 points by T-A  15 hours ago   154 comments top 35
soupbowl 15 hours ago 12 replies      
Just a note: paper is either made of dead standing trees or wood that is so full of rot/defect that no lumber can be made from it. It is a common misconception that live green trees and whole forests are destroyed for paper.

- a former woodsman for 13 years

Edit: My experience is in western Canada. I forgot that hackernews is a global site. Some countries do grow pine specifically for paper.

philipkglass 14 hours ago 5 replies      
To make a ton of regular paper requires 100 tons of water, TBM says, while its Limex paper is made without water. In place of 20 trees, it uses less than a ton of limestone, as well as 200 kilograms of polyolefin.


"Making paper from wood chips involves planting trees, which can be carbon neutral, so Im not sure how much appeal this will have" from an environmental perspective.

Appeal from an environmental perspective: zero. Less than zero. Polyolefins are plastics. The vast majority are made from fossils. Polyolefins can be made starting from biomass but then so is ordinary paper. It's baffling if the inventor or his customers think that he's improved on the traditional environmental tradeoffs associated with paper production.
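
A back-of-envelope check of the figures quoted above (assuming the 200 kg of polyolefin ends up in each ton of finished sheet, with the remainder limestone):

```python
# Mass fractions implied by the article's numbers for one ton of Limex.
paper_kg = 1000
polyolefin_kg = 200
limestone_kg = paper_kg - polyolefin_kg  # 800 kg, if the remainder is limestone

plastic_fraction = polyolefin_kg / paper_kg
print(plastic_fraction)  # 0.2 -> roughly a fifth of the sheet is plastic
```

That one-fifth plastic content is what makes the environmental comparison against recyclable, biodegradable pulp paper look unfavorable.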

everyone 11 hours ago 3 replies      
Isn't hemp one of the most ecologically sound materials to make paper from?

random link: http://www.hemphasis.net/Paper/paper_files/hempvtree.htm

And one of the reasons marijuana was originally prohibited in the US was due to lobbying from wood-based paper companies.

dpark 39 minutes ago 0 replies      
So instead of carbon neutral pulp paper that can be recycled and biodegrades, this paper is 1/5 plastic, doesn't biodegrade, and probably ruins the batch if mixed with pulp-based paper for recycling.

The water savings are good, but this doesn't seem like a net gain.

failrate 14 hours ago 2 replies      
Buried in the article is that the binder is polyolefin, which basically just means plastic. So, this is plastic paper. Granted, the examples given were for semi-durable items like menus and business cards, but I still see that as a net negative.
lazyjones 5 hours ago 0 replies      
I see 2 problems with this:

* the other ingredient is polyolefin resins - unless they are obtained from recycled plastic bags or similar, this probably makes Limex worse than normal paper (how much water is involved, we don't know for sure).

* what happens at the product's EOL? Will it end up together with recycled paper and cause problems in that process? Will it have to be recycled separately and if yes, who's going to be able to tell these types of paper apart?

tray5 10 hours ago 0 replies      
I've got a notebook of stone paper that I bought from a local office supplier which I use for general note taking. It's not the brand in the article however. Writing on stone paper feels quite amazing. The pen just glides over the paper, and the paper feels very high quality. It's waterproof and doesn't tear like normal paper (when you pull it apart it pulls like a flimsy elastic plastic that eventually rips). I thoroughly enjoy using it.
iamatworknow 1 hour ago 0 replies      
I've been using these "Ogami Collection" notebooks apparently made from limestone for a while now: https://origin68.com/collections/notebooks-stationery/produc...

They're like much more durable Moleskines, in my experience.

tiplus 8 hours ago 0 replies      
I am worried about the dust this type of paper might produce when it is shredded or ripped, similar to the dust from rockwool insulation materials made from stone fibres. If I remember correctly, the persistence time of rockwool dust (current generation, not 1970s) in your lungs is about 4 weeks, during which it is/may be carcinogenic?
Overtonwindow 13 hours ago 1 reply      
I'm not sure about this. Trees are sustainable and renewable. Rocks are not. This is why mining can be so devastating to an area
Animats 9 hours ago 1 reply      
Paper is a declining consumer of wood. "Peak paper" was years ago, after print newspapers tanked. There are many abandoned paper mills, if you want one. It's possible to make paper from rice hulls (works fine, and common in China, but a bit hard as a writing surface), kenaf (works, but nobody bothers), and hemp (niche product for potheads).

It's just not a problem.

Inconel 8 hours ago 0 replies      
Years ago, when I got the original Nexus One smartphone, I'm fairly certain the box it came in contained a small card with instructions on it that was made from limestone. I can't remember the name of the company responsible but I remember it being printed on the card and that I checked out their website since I was intrigued. I never saw those non-paper cards in any subsequent Nexus packages.

Edit: I wonder why Google stopped using them. Or maybe they didn't and I just never noticed since they were so paper-like.

george_ciobanu 7 hours ago 1 reply      
Trees aside, this is not a new concept; stone paper notebooks have been around for a while. The ones I had wrote incredibly smoothly and I loved every one of them. I think Walgreens still carries a small one.


isaac_is_goat 14 hours ago 2 replies      
Silliness. Paper is a sustainable and easily renewable resource... we don't need to find alternatives if we just keep planting trees.
bmcusick 3 hours ago 0 replies      
I wonder what the archival properties of this paper are like. Is it fully inert? That would be useful.
oxplot 12 hours ago 1 reply      
There doesn't seem to be any mention of the few existing companies that have been producing notebooks and other products from limestone for years now. One example:

* http://nuco-direct.com/nu-stone/

needlessly 38 minutes ago 0 replies      
Paper? What is this, 1995?
ipsum2 14 hours ago 1 reply      
Would this be considered a business, not a startup? I don't see major disruption happening in this space. Lots of companies make non-wood paper, e.g. https://en.wikipedia.org/wiki/Stone_paper and plastic.
chefandy 12 hours ago 0 replies      
I have a notebook made from stone paper. I use it in my kitchen because water won't kill it and I can write on it in sharpie without it bleeding, even onto the opposite side of the page. This is not new.
saagarjha 13 hours ago 1 reply      
Quick question: what are the properties of this paper? Can I burn it? Fold it? Eat it?
Nursie 5 hours ago 1 reply      
Stone.... and polyolefins. This somewhat pollutes the message. Is it anywhere near as recyclable or biodegradable as 'normal' paper?
hkmurakami 8 hours ago 0 replies      
This seems way less sustainable than pulp based paper. What's the appeal?
nottorp 7 hours ago 0 replies      
I'm assuming that besides it being waterproof, it's also fireproof? That would make it interesting.

Other than that, limestone isn't renewable while trees are so...

MPSimmons 3 hours ago 0 replies      
Paper sequesters carbon. Why is using stone a good thing here?
dguo 12 hours ago 3 replies      
"He says its the answer to concerns over deforestation and water shortages, with world demand for paper set to double by 2030."

I'm very curious as to what is expected to be driving this increase in demand. If anything, I would expect electronics to make paper more and more obsolete. But what am I missing?

elicash 10 hours ago 0 replies      
You can view the end result, including when they attempt to tear it, in this video: https://www.youtube.com/watch?v=KchJZKoT16w
dwenzek 8 hours ago 0 replies      
And this startup aims to replace concrete with a wood-based material.


med_abidi 3 hours ago 0 replies      
I guess this is what it means to be written in stone.
bronz 9 hours ago 2 replies      
why is there no effort to make artificial tree mass? this could be done by creating genetically engineered trees that use as little resources as possible to grow as much wood as possible in lab environments. one could even go as far as powering the cells electrically -- it is theoretically possible and has been done in the lab before. or, why is there no effort to synthesize wood itself chemically, industrially?
johngrefe 13 hours ago 1 reply      
I wonder which produces more CO2: mining stone, or growing trees and processing them into paper?


jcwayne 2 hours ago 0 replies      
> world demand for paper set to double by 2030

We have failed.

jonah 9 hours ago 0 replies      
There used to be a company called EarthShell which made biodegradable food packaging and other containers out of limestone and other stuff. It went public in 1998[0], filed for bankruptcy protection a decade ago[1], and their twitter account was last updated in 2010[2].

I wish TBM better luck.

[0] https://secure.marketwatch.com/story/earthshell-begins-to-cr...

[1] "After losing more than $331 million during its 14 years of operations, EarthShell Corp. has filed for Chapter 11 bankruptcy protection." - http://www.bizjournals.com/baltimore/stories/2007/01/29/stor... (paywalled)

[2] https://twitter.com/EarthShell

Interesting trivia: the founder and chairman, Essam Khashoggi, is the brother of Adnan Khashoggi - a central figure in the Iran-Contra affair.

desireco42 12 hours ago 0 replies      
I had one of the most beautiful writing experiences with pencil on stone paper; it is amazing (Field Notes).

On the other hand, it is not for fountain pens. I think it is specialty paper that is very useful but will not replace regular paper, and it also should not be sold as an ecological alternative.

id122015 9 hours ago 0 replies      
Going back to the Flintstone age?
ccvannorman 2 hours ago 0 replies      
FINALLY, I'll be able to take notes while scuba-diving.
Atom 1.18 released atom.io
117 points by aditya42  12 hours ago   110 comments top 10
ainar-g 5 hours ago 20 replies      
Please, don't consider this a troll question (I personally am a vim kind of guy), but is there any reason to use Atom over VSCode these days?

The sole reason to me seems that VSCode is developed by the "evil" Microsoft. Other than that, most people I know who tried Atom switched to VSCode or (back to) Sublime in the end.

ssijak 3 hours ago 3 replies      
I really wanted Atom to work. But in the end, it is just too damn slow, takes too much memory, and chokes on large files. Now, I find vscode to be very much what I need for javascript/typescript/html/css work.
atiredturte 1 hour ago 1 reply      
I used to love Atom, the plugins were perfect and I could customise it to do exactly what I wanted. However, even on a fully specced macbook air, it just became too slow. I moved to VS Code because it had the speed I needed with the customisability that I wanted. Hopefully this Atom update increases performance though, I really liked that editor :)
astrod 1 hour ago 1 reply      
I recently switched from vs code to atom, before that I was on webstorm.

Loving the hydrogen plugin to evaluate lines or js expressions inside the editor. https://atom.io/packages/hydrogen

ericfrederich 3 hours ago 0 replies      
Wonder if it changes anything significantly. It would really have to, to change my opinion. It always felt very slow to me even on killer hardware.

This page quantifies it: https://github.com/jhallen/joes-sandbox/tree/master/editor-p...

I don't so much care about memory consumption. It can be 5 to 10 times as much as Sublime... I have RAM to spare. What I do care about is that a lot of these tests are 20 times SLOWER than Sublime (which can already be considered a heavy-weight editor).

gri3v3r 4 hours ago 4 replies      
I prefer Notepad++. On Linux, Geany is a viable alternative... If I need an IDE, I will get an IDE. I want my text editor to be as lightweight as possible. This was not the case with Atom.
kalendos 6 hours ago 0 replies      
Also, Atom 1.19 Beta has a native text buffer. This should reduce memory usage for large files. Will try it out!
baby 4 hours ago 1 reply      
This is the 5th time I've tried Atom (after switching back to Sublime), but I might start using it a lot more now that I've discovered go-plus. Atom might really be the best text editor for golang.
nickrio 5 hours ago 1 reply      
I don't know. Since it's basically built upon ECMAScript, HTML and Webkit already, why not also wasm?
jokoon 4 hours ago 0 replies      
Insert another rant about how atom is bloated compared to editors like sublime text, notepad++, and VScode (which I discovered recently and seems to be a clone of sublime text).
A Sociology of the Smartphone longreads.com
54 points by anjalik  14 hours ago   13 comments top 3
pavement 10 hours ago 3 replies      
Technological advances don't seem to have an undo button, as far as I can tell. Once a technology of utility is unleashed, things keep advancing and the genie never goes back into the bottle.

I suppose technology can go backwards, but only by way of generation-spanning disaster. The fall of Rome is an obvious example. I guess this means we'll be dealing with this constant technological overload for the rest of our natural lives, huh?

politician 10 hours ago 1 reply      
Wouldn't it be a treat if this article were presented to prospective buyers of smartphones rather than a bland ToS?

Sign right here to strike this Faustian bargain.

(Disclosure: I own and operate a smartphone.)

djsumdog 8 hours ago 3 replies      
There's a lot wrong with this article. Do people actually use their phones for transit? I have transit cards from like 10 cities, and transit cards don't run out of batteries. Cards/cash don't run out of batteries either, and you can use them to refill your transit card, and I don't see those getting replaced any time soon. (Germans still use cash for like everything, which feels weird in an EU country).

And we still have plenty of phone booths ... if you need to take a piss. I'm pretty sure that's the only thing they're used for in London. Seriously don't touch those; they're disgusting.

Do people actually meet legit partners and friends from apps? I've never met anyone off Tinder (I'm not all that cute really), dating websites are a wasteland, and all the people I've had meaningful relationships with, I've met doing things I like to do. Sure we have meetups, but they really just facilitate what we used to do with flyers/posters before. Meeting people still takes work. No one meets people in Internet chat rooms like in the AOL/Seinfeld era. The Facebook generation of social networks wants you to only connect with people you already know.

Smart phones, cell phones, affordable laptops, the Internet, television, radio, magazines, news and print media ...

Every advancement is huge. Every advancement initially supplied free and open speech. Printing presses pre-Civil War were less than $10k USD to set up (in today's money). Papers crossed the country and spread news and opinions. Large media, advertising, and the pony express (incredibly expensive but necessary for consolidating news) pushed the price of starting a viable news print service to over $10 million in today's money by the 1930s. Same thing happened to radio...


Are phones a much bigger revolution than many of the previous ones? Yes. It's more power than a Pentium 4 laptop, in your pocket, and always connected. But it wasn't unpredictable. Engineers at MIT in the 90s/2000s had bulky wearables with eye monitors and small hand keyboards (we still haven't seen a viable version of this concept, like the glasses in Back to the Future, except maybe Google Glass). Old sci-fi books have people typing messages on their wrist computers.

It's a big deal yes, but a lot of what we see today is "get off my lawn" and "those dumb [millennials, boomers, gen-x] are so entitled." Adam Conover does a GREAT commentary on the problem with generation labels: https://www.youtube.com/watch?v=-HFwok9SlQQ

..and he says that we try to blame the media (radio, televisions, phones) and all the same things we say about phones today we could see in magazines applied to TVs decades ago. It's not that media is changing things, it's just more media.

Look back at videos from the 1960s in big cities. People still didn't talk to each other on the trams, the subways, the rails .. they had out their newspapers and their magazines. Jump to the 1980s and the nerdy kids and adults had their walkmans. Jump to the 90s and everyone had a walkman or discman (that ate batteries).

Today phones have replaced a lot, but they have added to the ability to be minimalist. They need to be built to last longer (I would rather keep a phone for 5 years .. planned or negligent obsolescence is insanely wasteful. Most of those old walkmans on eBay still work), but I don't think they've changed things as significantly as the author suggested.

We don't look in at TVs in store front windows to watch the news, but we still watch sports together in bars. We're still getting news from huge monolithic sources, but now it's in our hand instead of on the news stand. We no longer have to wait in a parking lot and be like, "Where is Nick?" "I just called his house on that payphone. His mom said he left like 30 min ago." "arg I wish he'd get here. I wanted to start at 9am!" .. we get lost less, we organize people better, we no longer have to send mail to "general delivery" at a post office if we don't know someone's address.

We have all this communication, but futurists predicted these types of advancements would come. The privacy aspects .. yes those are frightening. Yet it's just an extension of what we were doing in the 80s with the earliest credit reporting and financial computers (See Adam Curtis's Hypernormalization).

I don't think articles like this do us a service because they're not presenting the advancements through a meaningful lens.

Garry Kasparov: Why the world should embrace AI [video] bbc.com
59 points by Nikiforos79  3 hours ago   47 comments top 11
jasode 2 hours ago 3 replies      
I haven't read Kasparov's book and the short article doesn't really dive deep into it so I can only comment on the BBC soundbite...

If people are "pushing back against AI", it's not the progress of technology they're against -- it's the economic consequences. People are worried about joblessness and no financial security for retirement.

It's similar to saying "embrace outsourcing because you get cheaper products" or "embrace H1B because America was built on immigrants and their skills".

You can't just speak of those aspirations as general platitudes without being aware of what the real worries are. People aren't xenophobic -- they just want to keep their livelihoods.

If you don't address the commoner's concerns, you'll be perceived as disingenuous.

opportune 2 hours ago 3 replies      
More pop AI garbage. As a rule, pretty much any article about AI from someone outside of the field of AI or CS in general is going to be full of these vacuous, overly broad arguments. Stephen Hawking is guilty of this too. Basically any mention of "AI" and "dystopian" in the same sentence is a red flag for this kind of stuff. I wish that as a website we could just ignore these articles.

No, BBC. Gary Kasparov is not an AI authority. Please don't treat him like one.

EternalData 1 hour ago 0 replies      
I saw Garry speak in San Francisco -- I'm convinced he's an intellectual giant, but he was very candid in admitting that for all of his trained knowledge in chess, once you take that to an adjacent challenge (the game of Go), he struggles with the basics. I think the same logic should apply with his understanding of AI.

I do think there is a general phenomenon of "smart people outside of AI saying dumb things about AI" -- I'm guessing it's because of the massive implications artificial intelligence is going to have on every part of society.

forgot-my-pw 13 minutes ago 0 replies      
He recently did a Talks at Google with the CEO of Deepmind: https://www.youtube.com/watch?v=zhkTHkIZJEc

It mostly covers his experiences with Deep Blue. Wish it was a longer talk. He generally has a pretty positive view on computers and AI.

One interesting bit is when Demis Hassabis questioned whether a chess-trained AlphaGo could be stronger at chess than Stockfish. It would be fun if they really tried it.

PS: Hassabis once reached master standard at the age of 13 with an Elo rating of 2300 (at the time the second highest rated player in the world Under-14 after Judit Polgár, who had a rating of 2335) and captained many of the England junior chess teams.

petters 3 hours ago 2 replies      
A few years ago, he wrote an article explaining that computers could never be good at poker, because that required bluffing, a skill exclusive to humans.
grondilu 3 hours ago 2 replies      
"[...] machines have algorithms, and they're getting better and better, but machines have no curiosity, no passion, and most importantly, machines don't have purpose."

IMHO the greatest uncertainty regarding AI, and in that sense the greatest risk, is what happens precisely when AI becomes so good that it can be made to have all those features. Intelligence is probably the toughest part of a more global objective: emulating whatever the human body and brain can do.

A world where machines can do whatever the human body does is vastly different from ours. It's hard to even imagine it, even though some authors are trying [1]. And some of the possibilities include the end of our biological lineage.

1. http://ageofem.com/

kushti 2 hours ago 1 reply      
Interestingly, Kasparov is considered as a great intellectual in the West, while in Russian-speaking world he is known for support of some utterly controversial theories, e.g. Nosovsky-Fomenko pseudo-historical bunk (see, for example, https://www.youtube.com/watch?v=4Thfip4Owz8 [RUS]).
peculiarbird 3 hours ago 9 replies      
As one of the first humans to have their job outperformed by a computer I feel Gary has taken it well in stride. I suspect the rest won't be so gracious.
cJ0th 2 hours ago 1 reply      
When they list the fears they forgot an important one: discrimination.

Of course, discrimination has always been a problem but I'd like to believe we see some progress in "classical areas" (for instance sexism). The problem with AI is that it is always pigeonholing. As a result myriads of new classes of minorities which are so small that they neither have a name nor a voice will emerge. For instance, you get rated for creditworthiness and somehow you're not typical re: attributes x,y and z. You get a bad rating but you can't really complain as the algorithm probably didn't take things like sex or race (directly) into account. However, it could be that your FB posts are enjoying above-average amounts of likes from people with low education and this may raise red flags etc...

ivanb 3 hours ago 2 replies      
Why would anyone listen to him? He has neither an economics nor a technical education. From what I know about him, he just knows how to play one game really well. Basically a sportsman. Why would I listen to a sportsman on an economic topic that is greatly affecting society?
HONEST_ANNIE 2 hours ago 0 replies      
Kasparov says that machines don't have purpose, passion or curiosity. This is typical expression of meat machine privilege.

The purpose of a system is what it does (POSIWID). Feeling purposeful and not feeling purpose is just utility function giving feedback.

Is passion anything else than utility function configured so that the machine does not wander aimlessly and idle? If the machine has lots of freedom to choose its actions and it chooses to avoid 'side quests' and prefers one issue over all others, does it have passion?

How about curiosity? Consider phenomenon that is hard to predict and the machine can't recognize it. Is it curious if it has been programmed to investigate, poke around and learn. When boosting algorithm increases the weight for incorrectly classified instances, isn't that a primitive form of curiosity?

Kasparov also says that AI does not make us obsolete. I assume that obsolete means that if a meat unit is removed from its environment, others (society) don't miss its contribution. What does 'us' mean? I interpret that as 'AI does not make all of us obsolete.' This is most likely true for the foreseeable future. If not for any other reason than comparative advantage. The oversupply of meat machines has already made many of them undesirable (overpopulation).

Ableton Live Redesign nenadmilosevic.co
282 points by nndmlsvc  19 hours ago   81 comments top 36
roldie 3 hours ago 1 reply      
This is unrelated to the quality of the redesign, but I would advise other designers not to follow this method exactly.

Talking to users is great, it's what separates a designer from someone who just pushes pixels, but it's not a good idea to ask users what they like or what they want.

"Does ____ look like a helpful feature to you?" "Does ____ work for you?" "Is this better than before?"

What's the response going to be? "Sure, sounds good to me". Or "nah, I hate that kind of thing".

What you should do instead is observe user behavior. Watch how they interact with the tool/site/etc. and what they do. What are they trying to click on, what are they looking for, what is confusing. Ask them to talk out loud and share their thought process as they're going through the design.

Granted, the designer here may not have had time or access to do this, but then the questions should be more along the lines of what goals the users are trying to accomplish, and what helps or hinders them.

There's the old adage attributed to Henry Ford, "If I had asked people what they wanted, they would have said faster horses." Through observation, or task/goal-oriented questions, discover what people need instead, and design for that. My explanation here is very simplistic, but I hope it gets the idea across.

Source: Am designer, and review portfolios and conduct interviews. We look at process as much as we look at quality of output.

bjt 18 hours ago 4 replies      
I like that he followed a process of gathering suggestions from real users and then testing his solutions with them. It's a big step above the typical "I redesigned X!" posts that look really pretty but probably aren't very usable.

I wonder what kind of job one would expect to get from this. These tasks (gathering user feedback, doing visual designs) are probably divided between several people at Ableton already. Does he want to replace the whole team? Is he asking to be their boss? Would he be OK with a junior position that lacked authority to make all the changes he's proposing here?

aguynamedben 11 hours ago 0 replies      
An Ableton thread made it to the front page of HN?! YES. Awesome work. +100 on the expanded dedicated mixer view. I like Live because it's so musical and the Push 2 is epic, not because I make electronic music.

I noticed metering was on the survey. I really wish that within a device rack (along the bottom) the metering between devices had settings that let you see Peak, RMS, and a numbered meter. You could do it all with lines... one meter for RMS, peaks that sit at the top, and tick marks for 0, -6, -12, -18. Most people don't fully appreciate gain staging and it's so important. I use a lot of vintage Waves plugins (CLA-76, LA-2A, API 550/2500) and I'm always having to insert Klanghelm meters, Utility effects, or observe the In/Out reading on those plugins to make sure I'm driving them right. The device area becomes messy just because of a simple lack of metering and gain staging.
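The Peak/RMS distinction the comment asks for is easy to state in code. A minimal sketch (the function name is mine, and real meters add windowing and ballistics that this omits):

```python
import math

def meter(samples):
    """Compute peak and RMS levels in dBFS for a block of
    floating-point samples normalized to [-1.0, 1.0]."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))

    def to_db(x):
        # Guard against log(0) for silent blocks.
        return 20 * math.log10(x) if x > 0 else float("-inf")

    return to_db(peak), to_db(rms)

# A full-scale square wave has identical peak and RMS levels (0 dBFS);
# for a sine wave the RMS would sit roughly 3 dB below the peak, which
# is why a peak meter alone misleads you about perceived loudness.
square = [1.0, -1.0] * 100
peak_db, rms_db = meter(square)
```

Driving a vintage-style compressor "right" is exactly a question of where the RMS level sits, which is why peak-only metering between devices isn't enough.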

Sadly, I think most EDM music makers don't understand this. Maybe I'm stuck in the acoustic 70s using these vintage-style plugins. =)

My other beef is lack of Arrangement View features on Push 2, but alas, this isn't a hardware redesign. =) NICE WORK.

anonova 15 hours ago 11 replies      
I don't really have a problem with the UI; I want Ableton to fix the other annoyances first:

* Saving the project resets the undo history. As someone who frequently saves, this is super frustrating.

* No normal zoom controls. You have to use the "minimap" at the top.

* Only a single project can be open at once, which makes copying from old projects tedious unless you shuffle files around into your current project.

* Tracks can only be grouped one level deep, i.e., no subgrouping. Most DAWs give you arbitrary grouping depths.

* No way to overlay multiple tracks in the piano roll. I end up starting in one group and extracting to individual buses afterward. This doesn't work well the other way around.

My biggest wish for the next version is project versioning. It's an absolute nightmare managing so many save "checkpoints", especially since, as I mentioned, you can only have a single project open at once.

betageek 6 hours ago 1 reply      
If I was a UX student looking for something to redesign Ableton Live is one of the last programs I'd ever look at - it's a classic of UI design. If you want to show off your UX chops, choose something that's broken e.g. 95% of the software in the world.
moxious 15 hours ago 1 reply      
Best. Job application letter. EVER. When I'm hiring I'd love to see this kind of passion and attention to detail. I hope he gets a call from Abelton
teilo 18 hours ago 1 reply      
Your design is very nice, and has much to recommend it. But in my opinion, it kills the advantage Live has over many of its competitors. It takes up too much space, and is too busy.

Live's UI is busy, but not as busy as many other DAWs. It is just enough to get the job done without using up screen real-estate needlessly. This is one of its many strengths.

Electronic music production needs efficient workflow. A utilitarian but efficient UI wins over a better-looking but less efficient UI. This is why Live hides so much functionality in small buttons which toggle additional sections of the UI. Yes, it may not be obvious to someone coming into Live cold. But it makes working with the enormous complexity of a professional DAW a breeze once you learn where all the bits and pieces are, and lets you get the bits you don't presently care about out of the way so you can focus on the current task.

Jemm 1 hour ago 0 replies      
Please consider people who don't have perfect vision.

Grey text on a grey background is just not high contrast enough to be accessible.

lioeters 18 hours ago 1 reply      
Truly enjoyed this re-imagining of Ableton's user interface, with sketches, detailed diagrams, and thoughts behind each design decision. It was educational to see the process of improving visual organization for such complex software with hundreds of controls.

To implement this design, I imagine it would take a lot more than switching color schemes. I don't know anything about the internals of Ableton, but it would have to be made very modular, with public low-level APIs..

NamTaf 3 hours ago 0 replies      
Just a minor side note, and I don't know if it's just something you did during the data display or if it is actually reflective of how you surveyed, but if you're surveying for comparative purposes you probably should keep the descriptors and options as similar as possible (ideally identical). Having multiple different 'yes' and 'no' variations can introduce subtle distortions to the data.

If your goal is to actually deal with users and their feedback, this is probably more important than it is for anyone who just wants to show their design clout.

fancyPantsZero 18 hours ago 0 replies      
Ableton, hire this guy!

I like how you treated track groups. That's definitely one of the biggest missing features for me.

I also really like the level-of-detail sliders, what a cool concept, would really add lots of flexibility to the UI.

nkozyra 14 hours ago 0 replies      
First: what a phenomenal approach to a company. Shows a personal connection to the company and product that I'd bet is lacking in some current employees.

Second: I like the design but: is it just me or does this look a whole lot like Logic?

franciscop 4 hours ago 0 replies      
Really awesome project and research. I haven't used any music software for ages, but I think it'd be awesome if the author talked with some similar open source projects and actually got an interface implemented in some system.
sideshowb 17 hours ago 1 reply      
I hope this guy gets the job he's after! But imo the biggest thing missing from the ableton ui these days is touch support, and this design doesn't address it.
hit8run 7 hours ago 0 replies      
Big Ableton Fan here and using it for more than 5 years, even though recently I purchased Logic Pro X too. I like that he added a scale feature to Ableton in his draft (one thing I love about Logic). To me this theme is not so easy on the eye and lacking contrast.

I am currently using this theme by PureAV (UI designer of Serum Plugin) http://pureav.deviantart.com/art/Ableton-Live-9-Skin-5893383... but it also is not too easy on the eyes and sometimes I switch back to the default one.

Concerning UI language I also like this Ableton Redesign Concept:https://dribbble.com/shots/2255100-Ableton-Live-Redesign-Con...

MrScruff 8 hours ago 1 reply      
I'm really not interested in a flat UI from Ableton; this looks cluttered to me. What I want is a proper modulation system, rather than the hacks you currently have to go through with Max for Live to achieve a result.
pycal 5 hours ago 0 replies      
Nice job! I like how easy it is to see the metering between devices on your effects chain
cdevs 3 hours ago 0 replies      
I miss my ableton and fruityloops experimenting days I'm sure people who get to play with this stuff everyday love their job.
Shinchy 5 hours ago 0 replies      
But what will the arrangement window look like? That is where I get most of my issues from when using Ableton (and I use it a lot). Really glad to see that newer plug-in setup though - very similar to one of the things I love in ProTools.
amelius 2 hours ago 0 replies      
This UI looks like the control panel of a commercial aircraft.

Not sure if that's a good sign or not.

icanhackit 13 hours ago 0 replies      
Nenad great design man. You really nailed the colors and I dig the flat UI, similar to what you'd see in a RAW editor like RAW Therapee or Lightroom. It's consistent, professional and carries a certain logic through all of the elements. Hope you get the gig!
komali2 17 hours ago 0 replies      
This is one hell of a cover letter.
codazoda 11 hours ago 2 replies      
I'm curious where the list of users to survey came from. For the company, this should be easy, but for a guy making a cover letter, probably much more difficult.
JDiculous 15 hours ago 2 replies      
The worst UI flaw of Ableton is its atrocious piano roll. They need to make it more like FL Studio's piano roll - left click to enter a note and hold it down to adjust length, right click to delete (yes there's draw mode, but it doesn't let you do that)
fortyfivan 16 hours ago 0 replies      
Great work! Tough to tell if I would _really_ like it without spending some time in use, but the one thing that really jumped out at me was the various Clip types and states. Very nice.
nndmlsvc 9 hours ago 1 reply      
Hey guys, thanks for all the comments! If my site is down please head out to medium https://medium.com/@nndmlsvc/ableton-live-redesign-26efebe73... or behance https://www.behance.net/gallery/53789531/Ableton-Live-Redesi...


rhizome 14 hours ago 0 replies      
Just a point of quibble: are 16:10 laptops or displays that prevalent now? The usefulness of this would go way down at 16:9.
ziikutv 16 hours ago 2 replies      
Sadly the website is down and the Wayback Machine did not save the picture. I am really unhappy because I was really looking forward to being amazed.

Edit: Found a sneak peek on Nenad's Dribbble: https://dribbble.com/shots/3564223-Ableton-Live-Redesign

peapicker 17 hours ago 0 replies      
Really don't like the loss of the majority of the clip color into tiny little bars on the side of the clip. It may look prettier, but it is far less functional - especially for those who use Live "live" in addition to studio production.
exabrial 13 hours ago 0 replies      
<3 Ableton, even for its flaws. Absolutely an essential part of making live music these days.
the5who 7 hours ago 0 replies      
Very cool stuff. Support for Nenad.
kampsduac 13 hours ago 0 replies      
Nice work, hope you get the job!
brandonmenc 14 hours ago 0 replies      
A+ for effort, but this re-design looks way cluttered and busy compared to the current Ableton UI.
exabrial 13 hours ago 0 replies      
Request: please do a mainstage redesign!
poisonarena 12 hours ago 0 replies      
Burning Man OS
Is SHA-3 slow? noekeon.org
93 points by snakeanus  11 hours ago   68 comments top 13
Asdfbla 7 hours ago 3 replies      
My takeaway from this post kind of is that there are probably too many variants and derivatives of SHA-3. Then again, these days normal developers should probably use some recommended interface of mature crypto libraries anyway and shouldn't play around with cryptographic primitives directly, so maybe more choice for the library devs is good.
lucb1e 10 hours ago 0 replies      
Is this "is SHA-3 slow?" or is this "Look our new Keccak-based variant!"?
ComputerGuru 10 hours ago 3 replies      
Speaking of hashing quickly, I actually just posted an article about hardware-assisted SHA calculations and their support (or lack thereof) in mainstream processors, if anyone is interested.


kerkeslager 2 hours ago 2 replies      
Anyone who thinks that Hacker News comments are a great source of information should consider the number of top-level commenters on this thread who don't know the difference between fast collision-resistant hashes and password hashes, but who decided to post as if they knew what they were talking about.
Sami_Lehtinen 10 hours ago 0 replies      
There are alternatives like BLAKE2. https://blake2.net/
sigmar 9 hours ago 1 reply      
Summary of the post: SHA-3 is slow, but check out these other recently proposed parallelized variants of Keccak; they could be much faster.
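The speedup in those parallel variants comes from tree hashing: chunks of the message are hashed independently, and only the short leaf digests are combined at the end. A toy sketch of the idea (this is not the actual KangarooTwelve/ParallelHash framing; the chunk size and combining step are simplified assumptions):

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 8192  # assumed chunk size for the sketch; K12 also uses 8 KiB chunks

def _leaf(chunk):
    # Leaf hashes are independent of one another, so they parallelize.
    # hashlib releases the GIL on large buffers, so threads do help here.
    return hashlib.sha3_256(chunk).digest()

def toy_tree_hash(data):
    """Hash fixed-size chunks independently, then hash the
    concatenated leaf digests. Not the real K12/ParallelHash
    construction, just an illustration of where the speedup lives."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    with ThreadPoolExecutor() as pool:
        leaves = pool.map(_leaf, chunks)
    return hashlib.sha3_256(b"".join(leaves)).hexdigest()

digest = toy_tree_hash(b"\x00" * 100_000)
```

Plain SHA-3 is inherently serial (each permutation call depends on the previous one), which is why a mode change like this, rather than a faster permutation alone, is what buys the big wins on long messages.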
baby 8 hours ago 1 reply      
There is an RFC being drafted on K12 here: https://tools.ietf.org/html/draft-viguier-kangarootwelve-00
annnnd 6 hours ago 4 replies      
Genuinely curious: isn't the whole point of cryptographic hash functions that they are slow by design, so that they can't be made faster?
mtgx 8 hours ago 2 replies      
I think it's crazy that we're considering "settling" for a 128-bit hash function that we won't even start to adopt until 5-10 years from now, when Google is already planning to announce its 49-qubit quantum computer by the end of this year. It's also very likely that quantum computers will scale at a "Moore's Law" of sorts rate (doubling the qubits every 2 years or so, if not even faster in the first few years). We've seen that with D-Wave already.


I feel the same about all of those "faster" (read: weaker) IoT-optimized crypto algorithms that are being proposed right now for standardization.

What we should be talking about now is quantum-resistant algorithms that are likely to be even slower than SHA-3, but would at least protect communications against quantum computers. We need them soon, because they'll have to be deployed on 80%+ of the internet by 2030 or so, and we know how slow the internet ecosystem is at adopting a new protocol.

almostdigital 7 hours ago 1 reply      
Who's gonna be the first to launch KangarooCoin or KeccakCoin with a SHA-3 proof-of-work?
Tenoke 9 hours ago 3 replies      
A bit off-topic, but I am currently trying to recover ~$70000 worth of ether[1] from my presale wallet and the encryption used is indeed SHA-3.

I'm currently trying 21970881 passwords against the hash using fairly basic python code (ETA: 2-3 days), and can definitely use tips for speeding up the work, as I'd likely need to try an order of magnitude or two more generated passwords in order to crack it.

1. https://etherchain.org/account/0x35f5860149e4bbc04b8ac5b272b...
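The candidate-testing loop itself is simple; the usual speedups are smarter candidate generation and parallelism. A hedged sketch using Python's NIST sha3_256 as a stand-in (the presale wallet's real scheme involves its own key derivation and Ethereum's pre-NIST Keccak-256, which produces different digests, so this is illustrative only; the base word and variants are made up):

```python
import hashlib
from itertools import product

def find_password(target_hex, candidates):
    """Try candidate passwords against a known SHA3-256 digest.
    Illustrative only: not the actual presale-wallet derivation."""
    target = bytes.fromhex(target_hex)
    for pw in candidates:
        if hashlib.sha3_256(pw.encode()).digest() == target:
            return pw
    return None

# Generating variants of a remembered base word is usually far more
# productive than a blind brute force over the full character space.
base = "hunter"
variants = (base + "".join(tail) for tail in product("0123456789", repeat=2))
target = hashlib.sha3_256(b"hunter42").hexdigest()
found = find_password(target, variants)
```

For real throughput, the same loop is typically split across processes (one candidate range per core) or handed to a GPU cracker with a custom kernel for the wallet's derivation function.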

codazoda 10 hours ago 6 replies      
I'm not sure that SHA-3 has a design goal of being fast. My understanding is that if you're using it for password storage, for example, you want it to be slow. This helps against brute-force attacks, particularly if someone gets ahold of your database full of password hashes and can unleash as many calculations as they want. Because the SHA family has typically been fast, its members are considered bad candidates for passwords. My quick Google-fu doesn't turn up the goals of the competition, but I'm guessing being fast might be a negative.
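The distinction is easy to see empirically: a general-purpose hash like SHA3-256 costs a single pass, while a password-storage KDF like PBKDF2 deliberately iterates the hash with a tunable work factor. A quick sketch — the salt handling and iteration count are illustrative, not a recommendation:

```python
import hashlib
import os
import time

password = b"hunter2"
salt = os.urandom(16)

# Fast: a single SHA3-256 pass -- great for signatures and integrity
# checks, bad for stored passwords (attackers can guess at GPU speed)
t0 = time.perf_counter()
fast_digest = hashlib.sha3_256(salt + password).digest()
fast_time = time.perf_counter() - t0

# Deliberately slow: PBKDF2 repeats the hash; the iteration count is a
# knob you raise as hardware gets faster
t0 = time.perf_counter()
slow_digest = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
slow_time = time.perf_counter() - t0

# slow_time exceeds fast_time by several orders of magnitude
```

So "fast" is a virtue for SHA-3's intended uses (signatures, MACs, integrity); password storage is the one niche where you wrap a fast hash in a slow construction like PBKDF2, scrypt, or Argon2.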
atemerev 3 hours ago 2 replies      
Slow security is good security.

It makes brute forcing more difficult, and promotes decentralization of infrastructure (slowness adds up only for large-scale deployments like Google and Facebook, and we want to depend less on centralized actors, not more).

If you need to choose between strength and performance, choose strength in nearly 100% of cases. You make the job of the NSA or GRU much harder, and promote pushing security mechanisms out to the edge, where they ought to be.

How the .NET Runtime Loads a Type mattwarren.org
149 points by matthewwarren  15 hours ago   35 comments top
bigdubs 11 hours ago 3 replies      
This (somewhat) complicated process is also why generics can be fully reified in the CLR.
How we got 1,500 GitHub stars by mixing time-tested technology with a fresh UI freecodecamp.com
66 points by romanhotsiy  7 hours ago   24 comments top 7
Cynddl 5 hours ago 5 replies      
Despite the title, the article raises an interesting point: why do we prefer new code with fewer features to old but robust projects?

> Unfortunately, we were affected by cognitive bias: old code is bad code. But the truth can be the opposite. The old code is battle-tested by thousands of users in hundreds of different projects. Most of the critical bugs have been fixed, the documentation is complete, there are tons of questions and answers on StackOverflow and Quora.

I'm quite guilty here; I've realized that I always check the date of the last commit on Github before testing a project. I feel like I do this more for small projects where documentation and use cases might be missing, and for which I'm expecting help from the community.

crisopolis 2 hours ago 0 replies      
Also, their user onboarding literally forces an unknowing user to create a GitHub account and then star their specific repository.

Making GitHub stars ultimately meaningless, if they ever meant anything.

nerdponx 1 hour ago 0 replies      
But after looking at the source code we found a fatal flaw in this tool: it used Graphviz, a decades-old tool written in plain C and compiled to unreadable JavaScript using Emscripten.

Are the "decades old" and "plain C" aspects supposed to be bad things? It seems like the real problem is "compiled to unreadable JavaScript". Graphviz is a great example of "if it ain't broke, don't fix it".

nsmith7979 41 minutes ago 0 replies      
Hey, I'm the author of the "GraphQL Visualizer" tool (http://nathanrandal.com/graphql-visualizer/) that they mention in the article as inspiration for their project.

This new project is quite nice and definitely a step up from mine.

Also wanted to mention that there is a CLI version of the visualizer at https://github.com/sheerun/graphqlviz for when you want to quickly visualize a GraphQL endpoint for documentation purposes, etc. Keep up the good work!

diggan 3 hours ago 1 reply      
Ignoring the fact that GitHub stars are an awful metric for anything other than vanity, "more bells and whistles" feels like terrible advice; you should instead focus on an "easy-to-get-started" readme with just the minimal amount of bells and whistles needed to effectively communicate what your thing is really about.
lgas 4 hours ago 1 reply      
How much do github stars sell for these days?
amelius 1 hour ago 0 replies      
Or how about simply not succumbing to the latest UI trends, which will be different next year anyway?
Copysets and Chainsets: A Better Way to Replicate (2014) hackingdistributed.com
29 points by nodivbyzero  14 hours ago   2 comments top
GauntletWizard 9 hours ago 1 reply      
Precisely where they went wrong:

 In practice, the speed of recovery is typically bottlenecked by the incoming bandwidth of the recovering server, which is easily exceeded by the outgoing read bandwidth of the other servers, so this limitation is typically not a big deal in practice.
If you're recovering to one server, you're going to have a bad time. With random distribution, you recover to every server, equally, over a very short period of time. The tradeoff is that you'll have a lot of churn, as temporary failures cause a lot of data to be re-replicated, and the extra copies deleted as the failed servers come back online. On the other hand, this helps balance your utilization and load.

The actual insight is that you want failure-domain anti-affinity; that is, if you have 1000 servers on 50 network switches, you want your replica selection algorithm to pick not three different machines at random, but three different switches at random. If you have three AZs, put one replica in each of the three. Copysets can provide this, but as stated in the article, they're much more likely to give you Achilles heels: a typical failure won't hurt and won't cause any unavailability, but hit the wrong one and you go down hard, with N% data loss rather than thousandths of a percent.

In short - Failures happen. Recovering from them is what matters, not convincing yourself that they can't happen.
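The anti-affinity placement described above fits in a few lines: group servers by failure domain, choose distinct domains first, then one server inside each. This is a minimal sketch of the idea only — names are hypothetical, and a real placer would also weigh capacity, load, and rack topology:

```python
import random
from collections import defaultdict

def place_replicas(server_to_switch, replicas=3):
    """Pick `replicas` servers that sit on distinct switches (failure
    domains): sample the domains first, then a server within each."""
    by_switch = defaultdict(list)
    for server, switch in server_to_switch.items():
        by_switch[switch].append(server)
    chosen_switches = random.sample(sorted(by_switch), replicas)
    return [random.choice(by_switch[sw]) for sw in chosen_switches]
```

Because each replica lands in a different domain, losing any one switch (or AZ) can take out at most one copy of any object, while the per-server choice stays random enough to spread recovery load.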

Twitter, but for Math, with Toots ams.org
159 points by aqsalose  18 hours ago   28 comments top 7
tonmoy 10 hours ago 0 replies      
In case you didn't get the Pi * z^2 * a joke (like me): the volume of a pizza with radius z and thickness a is pi * z * z * a.

Edit: formatting
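Spelled out, the joke is just the cylinder volume formula, which a couple of lines will confirm (the sample numbers are arbitrary):

```python
import math

def pizza_volume(z, a):
    # a pizza is a short cylinder: V = pi * r^2 * h, with radius z
    # and thickness a
    return math.pi * z ** 2 * a

print(pizza_volume(7, 0.5))  # ~76.97 cubic inches for a 14"-diameter pie
```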

j2kun 15 hours ago 0 replies      
I joined a few days ago, it's been great so far :) https://mathstodon.xyz/@j2kun
badosu 10 hours ago 0 replies      
Since I see people annoyed in the comments by the lack of a timeline view: you can see the public timeline for this instance at [0], or enter mathstodon.xyz on [1].

[0] - http://www.unmung.com/mastoview?url=mathstodon.xyz&view=loca...

[1] - https://m6n-view.hnle.tk/

farkob 7 hours ago 1 reply      
There is something inherently wrong with this, I think. This trend of knowledge, like any commodity, getting cut down into its smallest intelligible or consumable piece and delivered that way is not suitable for disciplines like math. Sure, Twitter is good for memes, one-liners, little nuggets of knowledge or headlines, and sometimes even tweet chains, but math requires more context, more rigor, more attention from the consumer, which is the opposite of what Twitter offers.

Sure it can be used for link sharing, but in my opinion reddit offers a better kind of model for this with its structured comments section.

420dao 11 hours ago 0 replies      
put a math problem in a box and put it behind a login. this is more annoying than anything.
badosu 15 hours ago 1 reply      
They might find this work useful [0]

[0] - https://github.com/tootsuite/mastodon/pull/2404

ue_ 16 hours ago 3 replies      
Mastodon is a really great piece of software, at least from the point of view from a user like me, and I really wish more people would use it. There's so much content out there, so many things to reply to and see, from different corners of the fediverse.