> Because Go has so little magic, I think this was easier than it would have been in other languages. You don't have the magic that other languages have that can make seemingly simple lines of code have unexpected functionality. You never have to ask "how does this work?", because it's just plain old Go code.
That lack of magic and his comparison to C# sounds like a really good mix.
Thanks for blogging about your work on juju! Despite Go already being five years old, many of the patterns around building large applications are only emerging now.
I mean, you don't have to implement all the interfaces explicitly, you just have to get your structure right and be done with it.
Am I missing something?
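For readers unfamiliar with Go, a minimal sketch of the implicit interface satisfaction being discussed (the `Ringer`/`Bell` names are made up for illustration): a type satisfies an interface simply by having the right method set, with no "implements" declaration.

```go
package main

import "fmt"

// Ringer is satisfied by any type with a Ring() string method.
type Ringer interface {
	Ring() string
}

// Bell never mentions Ringer, yet it satisfies the interface
// just by having the right structure.
type Bell struct{ Tone string }

func (b Bell) Ring() string { return b.Tone }

// Announce accepts anything that satisfies Ringer.
func Announce(r Ringer) string { return "heard: " + r.Ring() }

func main() {
	fmt.Println(Announce(Bell{Tone: "ding"})) // prints "heard: ding"
}
```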
But I wonder how it really scales on a large code base like this? Some of the best projects I've worked on leverage reusability more effectively to create a sort of vocabulary: they're far more concise, there is rarely more than one source of truth, and they're far easier to change and improve. Does this hold true for 540,000 lines of Go code?
540,000 lines of Go code
65,000 lines of comments
So on average only 170 lines per file, including 17 lines of comments.
Are these normal ratios?
It's written with knocking down a very specific set of straw men in mind, but rather carefully avoids coming anywhere close to addressing the legitimate criticisms of Go as a language. One of the things that's most irritating about Go enthusiasts is the way they try to close ranks on legitimate critique and reframe their language's warts as "simplicity".
Also: 20 bonus blub points for pulling the old 'I don't need generics, therefore NOBODY does' gambit.
In logistics alone: hundreds of warehouses in a dozen countries; hundreds of delivery stations, a fleet of drivers (independent contractors all) in the tens or hundreds of thousands; a fleet of airplanes (and an airport I hear?); and I read an article a while back saying they were buying a big ol' container ship because why not? Oh and the drones of course, whenever they get off the ground (eh? see what I did there?)
Then you add on AWS, Kindle, the prototyped no-cash-register stores, and whatever else they've got cooking that no one knows about. This company is my best bet for "Weyland-Yutani" from the "Alien" series.
Which brings up the main issue I have with to-the-door delivery: the packaging. The insane number of boxes inside of boxes really makes me feel guilty. There was a project at Amazon in 2008 to have consumer-friendly, "frustration free" packaging, as most packages are optimized to stand out on store shelves and prevent theft. (Of course that makes them bulky and hard to open. With posed toys that are secured with wire and screws, it's even more maddening.)
Where has that effort gone? Did it fall out of favor with manufacturers and / or consumers?
I would love to have Amazon drop off my order in as little packaging as possible and even collect & reuse special durable bottles and such for frequently used items like laundry detergent.
This seems like an opportunity as they increasingly build local warehouses and take ownership of the supply chain and delivery logistics.
I've been a customer since the late 90s and if they don't get this fixed, I'm going to start looking for another e-tailer to use.
I was okay with missing out on fast free shipping unless I paid $100/year, because I could still get slow free shipping. But now that I don't get the best price on a given product without Prime, shopping there feels like a bad value. The competition competes well on price now without extracting an annual fee.
As of a few weeks ago my Amazon rewards card gives me lower rewards than if I paid for Prime, so I'm looking for a new card. I'd chosen a card without an annual fee for a reason.
For the first 15 years Amazon trained me to buy online from one place. But in the last 5 it trained me off of that behavior. I have the added motivation to reject Amazon because of its treatment of employees and its business practices, but as just a customer and not an activist it's become a bad deal anyway.
Given that general levels of debt are approaching 2008 levels, is this something to be worried about in general, or is it irrelevant? The main reason for asking is that if large percentages of Amazon's growth can be attributed to a rise in consumer debt, then wouldn't Amazon be disproportionately damaged by another recession and/or some sort of debt bubble burst?
For its employees I hear only bad things, like that Amazon won't talk to the union in Germany or what The New York Times wrote a couple of years ago.
I'm so conflicted about this company. But it matters. Both sides matter.
It's like the laws of gravity don't apply to American tech firms, but are ruthlessly enforced on everyone else.
It's only a matter of time before Walmart kills Amazon. Walmart has enough cash to copy Amazon and steal its market share. The unchallenged status of Amazon will signal Walmart to enter its turf and steal away the competition.
Whether I buy a box of condoms on Amazon or Walmart need not matter. Whoever gets it to me quickly and cheapest wins.
Short AMZN and long WMT
Didn't sound all that wild. Sounds like they figured it out pretty quickly just by looking at a hex dump.
Going from 4K to 8K for a 32" monitor may seem like a small improvement, but it is a subtle sensory improvement that just makes using a computer more pleasant. Until displays reach one pixel per arc minute (1/60th of a degree, roughly the discerning power of the human eye under assumptions on contrast) from standard viewing distances, I will always want higher resolution.
Other than resolution improvements, it would be nice if someone would attempt an HDR light field display. This would ultimately lead to a monitor that is indistinguishable from a window.
My current 16:9 24" 4K display in portrait is too tall (I have to move my head up and down a lot) and too skinny (hard to put two windows side by side comfortably), so I have to fiddle with overlapping windows a lot. In landscape it would be too short, and I rarely watch movies on my desktop: one movie trailer a quarter, approximately, so optimizing for that use case is absurd.
I would prefer a shorter 16:10 instead, and was happy with the 22" 1920x1200 that was replaced, though I love the increased resolution of the new monitor.
This is tangential to this thread, but can anyone in here explain the state of the art in eye tracking? Actually rendering 16k quality for my peripheral vision seems insane to me, so I'm really interested in the barriers between today's tech and a good foveated headset.
Frankly, 8k at 31.5" is kinda pointless unless you have eyes of an eagle or are working glued to your monitor. As a tech demo from Dell, it's cool!
This seems like it will be possible one day.
At a normal viewing distance a 100dpi monitor is already decent. A UHD monitor is just great. You have to get very close to see individual pixels. I'd say doing AA is no longer required even then.
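The arithmetic behind claims like this is easy to check. A small sketch, assuming a 24-inch viewing distance and the usual one-arc-minute ballpark for visual acuity (both assumptions, not from the comment itself):

```go
package main

import (
	"fmt"
	"math"
)

// arcminPerPixel returns the visual angle (in arc minutes) subtended
// by one pixel of a display with the given dpi, viewed from
// distanceInches away.
func arcminPerPixel(dpi, distanceInches float64) float64 {
	pitch := 1.0 / dpi // pixel size in inches
	return math.Atan(pitch/distanceInches) * 180 / math.Pi * 60
}

func main() {
	// At 24 inches, a 100 dpi panel is ~1.4 arcmin/pixel, just above
	// the acuity limit; a 27-inch UHD panel (~163 dpi) is well below it.
	fmt.Printf("100 dpi: %.2f arcmin/pixel\n", arcminPerPixel(100, 24))
	fmt.Printf("163 dpi: %.2f arcmin/pixel\n", arcminPerPixel(163, 24))
}
```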
8K seems overkill for most purposes. Sure, there is a niche that can take advantage of it, but I don't see advantages for the mass market.
Same as with SACD, CD tech is simply good enough for pretty much everyone so SACD never took off.
I write this despite being a high-dpi junkie: I bought a ViewSonic VP2290b (IBM T221 clone, 3840x2400 22") back in 2006 and dealt with the huge hassle of its 4 DVI inputs for years.
My predictions aren't holding up. I feel too many customers are satisfied with small form factors and/or are tolerant of using multiple displays and suffering the inconvenience of bezels.
I have never understood the point of investing in such expensive new tech for YouTube videos. Most tech channel videos on YouTube become irrelevant really fast, so it's not like you are future-proofing. Two years from now, still very few people will have 8K screens, and the cameras will cost at least 50% less. Storing 8K videos will increase storage costs and processing bandwidth too.
I wish Dell would produce 8K monitors in a much larger format, like 48".
I appreciate the author saying 'maybe' because I can't, off the top of my head, understand why 16K would add any benefit over 8K... Aren't we at 'retina' with 4/8K anyway?
EDIT: at a 32" resolution...
If Covert Action had believable, mimetic, tantalizing, or at least interesting plots to foil, I submit that it could have been a tremendously compelling game, without changing anything else about it. Instead, though, it's got this painfully artificial box of whirling gears.
I suppose in the 80s, given the limitations of the box, there was a much larger emphasis on "narratology" in games, but this criticism might hold for most failed games.
Edit: Never seen this site before. Who is this guy? Now I am going down a rabbit hole...
I'm glad it exists, and for anyone who wants to teach kids about very basic crypto (i.e. letter replacement, removing spaces, etc.), I suggest the "training" for the Crypto skill; it's also good to go and brush up on your own skills. Similarly, the wiretapping is a genuinely interesting logic puzzle which can be used to teach the concept of physically representing 1/0 and logic gates (and how the result may not always be what you expect).
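As a taste of the kind of logic-gate exercise being described (a generic sketch, not the game's actual puzzle): XOR built purely from NAND gates, where the composed result is exactly the sort of thing a beginner might not expect.

```go
package main

import "fmt"

// nand is the single primitive gate everything else is built from.
func nand(a, b bool) bool { return !(a && b) }

// xor composed purely from four NAND gates, as in real hardware.
func xor(a, b bool) bool {
	n := nand(a, b)
	return nand(nand(a, n), nand(b, n))
}

func main() {
	// Print the full truth table.
	for _, in := range [][2]bool{{false, false}, {false, true}, {true, false}, {true, true}} {
		fmt.Printf("%v XOR %v = %v\n", in[0], in[1], xor(in[0], in[1]))
	}
}
```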
As far as "The Covert Action Rule", I guess the question is how you approach the game. If you look at it in terms of "older" games (i.e. Zork et al), having a notepad beside the computer to take notes seemed the most logical thing in the world to me. In many cases while playing games with complex storylines and puzzle-solving, it still does. I never found the tactical sections making me "lose track" of why I was in a building - either I was there to grab an Agent or I was there to get information to go and do so that I hadn't gotten by wiretapping or crypto.
I'd buy that. Purer, but not better. Procedural generation, in theory, has a lot of advantages, but in practice, in many areas, it hasn't matched the old school approach.
Has anyone played both Covert Action and Invisible, Inc for comparison?
Someone has to enjoy the first play, before they'll replay. You don't know how to make one good game yet, never mind a factory that makes good games.
Hylogen is an EDSL for livecoding shaders in Haskell 
My weapon of choice is Extempore by Andrew Sorensen . Dual languages in one system available for blistering speed in livecoded graphics and sound/music.
https://github.com/sleexyz/hylogen
http://extempore.moso.com.au/
Sonic Pi: http://sonic-pi.net/
On GH: https://github.com/samaaron/sonic-pi
Aerodaynamic: https://www.youtube.com/watch?v=cydH_JAgSfg
I would be inclined to use a language that has more signal processing built in, i.e. Mathematica:
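Even without built-in DSP, the basic primitive is small to roll by hand. A hedged Go sketch of a sine-wave oscillator, the building block a signal-oriented environment would give you out of the box (440 Hz and 44.1 kHz are just conventional example values):

```go
package main

import (
	"fmt"
	"math"
)

// sine fills a buffer with n samples of a freq Hz tone sampled at
// rate Hz, the kind of primitive a livecoding environment has built in.
func sine(freq float64, rate, n int) []float64 {
	out := make([]float64, n)
	for i := range out {
		out[i] = math.Sin(2 * math.Pi * freq * float64(i) / float64(rate))
	}
	return out
}

func main() {
	buf := sine(440, 44100, 5)
	fmt.Println(buf[0]) // a sine wave starts at zero
}
```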
These lights were 72,000 watts each, with a total of around three-quarters of a million watts of lights in the crash hall. (Almost as much power as the German artificial-sun installation.) High-speed cameras need a lot of light.
I've stood in front of one of the lights in protective gear - but the heat still just goes through your body.
The first time the IIHS lights were turned on it blew a power substation 20 miles away.
This one does not work inside a vacuum chamber but manages 10,000 times the normal solar radiation with the temperature reaching 3,000C. Wow!
Humans calling the Sun "finicky" is a bit like ants calling an aircraft carrier "slow and cramped", or something.
(three magical letter holder here)
There was a huge retrospective exhibition of Yoko Ono's work at Musee d'Art Contemporain in Lyon, France in 2016. Included were a mixture of interactive/participatory elements (ladders you were meant to climb, a room filled with hammers and nails that you were supposed to contribute to by banging a nail into whatever you wanted) and other obviously not interactive elements (videos, paintings, things protected by glass).
And I remember, in the same room as the ladders you could climb and combat helmets suspended from the ceiling by ropes that were swinging around because people were walking through them, was a sheet of either canvas or paper mounted on the wall, with another piece of either canvas or paper suspended about an inch in front of it by some string, with some holes cut into the front sheet so that you could just see that there was something drawn or written on the hidden layer.
Well, in a room filled with people talking and laughing and climbing ladders and pushing helmets around, I went to peek behind the curtain and lifted up the front layer to take a look at the one behind.
A custodian ran at me and told me to step away. I still don't know if I damaged it or participated in it.
He remembers a charred doorframe that had to be transported even though it could basically come apart any second once moved. They have special vibration-reducing boxes, with the same climate protection as humidors. There are also titanic insurance fees at work to move art. And sometimes somebody in some state-run museum is forced to take the cheapest option available.

One of those haulers venturing into art transported an artwork by a Chinese artist (I don't know the name): basically very long paper rolls with Chinese characters on them, to be hung from a hall's ceiling. Those rolls come in cardboard boxes sealed with tape, and the poor fellow takes a cutter and systematically cuts through the tape that seals all the boxes. Each box an artwork, insured for 25,000. Sliced.
I always see people using flash in museums. I assume it's because they don't know how to turn it off.
Also, there's this project based on the VPP dataplane technology, which could help you achieve higher bitrates. Personally I haven't managed to play with it yet: https://trex-tgn.cisco.com/
I just discovered that guy's (Rick Beato's) channel the other day. I normally find music theory very dry, but even my kids found that video interesting.
Why was this posted? It's a generic packet generator.
Also, one of the meanings of my first name "Mani" is literally "Bell".
Source: https://en.wikipedia.org/wiki/Tamil_script
Also: I'm Tamil.
Learning more about this is actually kind of hard. Unfortunately, there seems to be a lot of pseudoarchaeology concerning pre-Columbian transoceanic contact.
It makes me ponder whether the owner of the ship was himself cross cultural.
The Arab presence in Southern India actually predates the advent of Islam, so it's even possible (though not likely) that this artifact hails from before that time.
EDIT: TFA notes that the script is datable to 500 years old so it is probably not pre-Islamic.
I like language, and this inscription makes me wonder. Why write on a bell that it's a bell? I'd be less surprised if the bell had said "Muhayideen Baksh's ship".
I mean, I get that school buildings say "School" because otherwise it's really just a building. But a bell? Isn't that a bit like writing "Headmaster's Office Door" on the headmaster's office door?
I wonder whether maybe it was just an artsy kind of joke. A bit like how in import stores you can buy forks that say "FORK" on the handle.
I now imagine the crew on that ship looking up to the bell every once in a while, grinning, thinking "that Baksh fella is a tough one but at least he has some sense of humor".
"Australia experienced a wave of migration from India about 4,000 years ago, a genetic study suggests."
It reads phonetically as "MugayatheenPak Udaya Kappal Mani". The translation is spot on.
Source: I'm a native speaker from Tamil Nadu.
Nearer to me, the Tiwi islands just off the coast of North Australia have unearthed jade figurines and artifacts that seem to originate from China or another Asian country, which date back to before the time of Captain Cook.
Update: Article on early Chinese explorers reaching Australia a long time before Dutch or English explorers - http://www.theage.com.au/articles/2002/11/24/1037697982893.h...
You do understand homo sapiens did not evolve separately on these different continents/lands, right?
The actual discovery was when the first homo sapiens settled in the US, AU, NZ, etc. Of course some cultures managed to travel to these lands long before our historical records tell us. Native Americans are not a separate species of homo sapiens; they did not evolve there separately.
Basically there's genetic (Y chromosome) and linguistic evidence suggesting an infusion of Indian immigration (~5000 years ago so maybe a distinct event from this). One plausible explanation is that some Dravidian seafarers crash-landed in Australia and got absorbed by the local population. From their perspective it probably would have felt like sinking into savagery.
Voynich manuscript: undeciphered book from the 15th century
It does not 'feel' heavyweight or like it gets in my way too much; the contrary is true. Of course it has its quirks, as every larger lib does, but its pluses outweigh the minuses by far for me.
Preferences aside, does Angular make you less productive than other options? Do you feel that you fight the framework? Can you finish non-trivial frontend apps involving 5+ team members 'better' (with cleaner code, and much faster) using other options?
These are just honest questions. I wanted to start a pet project in Angular 2 soon, but I'm open to alternatives; maybe it is the right moment to try some of them.
On first note, the codegen size decreased dramatically for AOT compiled builds - we went from ~600 KB vendor + app minified and gzipped (not counting 50 KB of polyfills) to ~400 KB. This is huge, and the boot speed feels even faster on first user load of the page!
Thanks to the Angular team for this fantastic work!
Main change, as noted, is the new View Engine. The design doc is worth a read if you're interested in front-end at all.
Happy to answer any questions!
I'm sure the Angular team are great people, and they're clearly talented devs... but stay away from Angular.
And I say this as someone with a lot of experience with Angular 1 and Angular 2.
Trust me, go for libraries over frameworks every time. Redux, React, Immutable, Sagas, and reSelect. That's the future of web app development.
Angular provides consistent structure to your app and, with angular-cli, a consistent interface to front-end development. Having ported from Ember, IMO the CLI interface should be standard across all front-end frameworks.
I see it as a microkernel-like framework for front-end development. In Java land we have the OSGi standard for modular software component development; Angular has a sort-of similar design.
The only confusing part is NgModule; the team should rename Angular modules to NgBundle, which consist of native TS modules.
I read somewhere that the Angular team is working on a Material widget library; please make style-less components with theming support.
Angular 1 to Angular 2 made me drop angular entirely, that happened less than 6 months ago I believe.
Just reading Angular 4 gives me the creeps.
Is 4 widely different from 2? How am I supposed to know, if you're using semver and already have a really bad history?
And it's almost always a mistake to go SPA first. Using a django or a rails lets you get the basics and data flow nailed down early on. Get into a framework too early and have a need to change something? Have fun explaining to your manager/client how costly it is to do a "simple" modification to a JS app when you have to throw the state you built it upon out the door.
What I want is a system tightly coupled into a server-side framework like a Django or a Rails that degrades gracefully and I only have to program the interactivity one time. Something that'd plop right into the asset pipeline/django compressor so I don't have to go outside of the framework to build.
Hundreds of hours of my life have been spent chasing this dream of sharing server side code with client side JS frameworks. That's what I need.
Meteor didn't do it for me. As for rendr, I've done stuff better with backbone/express in-house. As of 2017, I get my best bang for the buck using django and pjax. No joking, I went from full DRF + Backbone Marionette -> to plain old jquery and pjax and couldn't be happier.
All these new build tools (grunt, gulp, webpack... come on), ES versions (I was ok with ES5). None of these things are helping me ship stuff ahead of / on time and correctly. They're creating an even larger gap between the server side data, logic and templates and the JS interactivity.
If anyone is listening, I'd love to have a well-supported opinionated distribution of django or rails that just renders forms, tables, etc. with angular/react/etc. and degrades gracefully.
I'll say personally I've never been a fan of Angular, but I think if you want a Java/J2EE-ish all-encompassing component model and decorator-/annotation-based GUIs, it certainly is a very strong contender (though kind of the thermonuclear option and absurdly complex IMHO, at least if you have some prior web development experience). I think Google's track record wrt. long-term maintenance isn't half bad really (GWT and closure tools have been around for a long time).
That said, I've recently talked to recruiters and was told Angular has already peaked as the go-to framework for enterprise MVC web apps and is being replaced by React and others (and I'm assuming Angular wasn't all that much used outside that demographic, because of the heavy setup and onboarding/buy-in).
An open question for me is what about TypeScript, eg. since Angular has been a major driver/user behind its type-heavy approach, will it suffer along with Angular?
What's going on with versions?
My criteria to adopting a framework:
* Makes me more productive
* Allow me to ship apps / solutions better and faster
* Minimal time doing some head scratching on how to implement stuff
* Integration with other frameworks
* Uncomplicated setup and build process
* Community support (hey, we can't possibly know every inch of a framework unless we are the author)
In 2017 I'm curious why I'd pick Angular over Vue for a web-based project, if anyone has insight. Not over "X JS framework" in general, over Vue specifically. Vue 2 is pretty much exactly what I wanted Angular 2 to be.
Install the latest version of the CLI:

npm install -g @angular/cli
And then run:

ng new project-name --ng4
The "--ng4" flag is currently required, as it doesn't yet install ng4 by default.
I am seeing platform-browser / platform-server namespaces. Can we expect platform-android, platform-ios, platform-jvm?
Looking at the angular compiler pipeline, with AST this should be possible ?
Sorry, Angular team, it just means you did a poor job at first.
How does Angular make my life easier or better in some way?
I currently use Python and Pyramid as a framework. Mako templates. And SQLAlchemy for database interactions.
How is any of this really better?
Anyone choosing monolithic over microframeworks/libraries will regret it over time, that time may be as quick as 6 months, right around launch and switching to maintenance. Pour some out for the poor bastards that have to support legacy versions of these frameworks.
Looks like cool features, but I have a hard time envisioning on how they all work together.
So, assuming good faith from the submitter, it means 50% of historic users of Angular have dropped it since 2015. Most of them never fully adopted it anyway.
If a technology is being dropped after 2 years by their early motivated adopters, maybe there is a smell?
I did publish the Firefox add-on (desktop only) just so I could avoid having to use web-ext or temporarily install it every time I run FF. It's a complete hassle to set up and configure at the moment. https://addons.mozilla.org/en-US/firefox/addon/ssure/?src=se...
Edit: Source is here https://addons.mozilla.org/en-US/firefox/files/browse/601156... in case you want to try and not worry about it doing something shady.
The equivalent of a pop-up newsletter modal is somebody on the street PULLING you aside, standing directly in front of you and preventing you from going any further until you answer their question. All without bothering to observe what you were doing beforehand. Your choice then is to step back the way you came to avoid the creepy sidewalk-blocking people. Ridiculous, creepy and unacceptable in real life but essentially exactly how web sites treat their visitors.
Automatically with a browser plugin, with an address that will even accept your mails. Unfortunately no human will read them in the end, but I'm sure your metrics will be great. I might even accept a cookie so you know I'm already subscribed to your great newsletter.
Now if everybody were to do that...
So please stop beeping at me and popping up a chat modal.
"sticking a big ole pop-up in their face can be one of the most effective ways to jolt their attention & grab their email for a return visit." Peep Laja. https://conversionxl.com/popup-defense/?hvid=2EcGFw
1. The crowdfunding sites themselves maintain HUGE newsletter lists and use very advanced analytics to determine what to place in those newsletters.
2. For campaigners, the size and activity of your email list is a huge factor in determining your campaign's success. Just like the web tool https://www.thunderclap.it/about, sending a direct email blast to a good list can mean the difference between a successful hard launch and campaign and a lackluster or failed one. The email lists of the sites themselves, which feature several campaigns, are hugely influential on campaign success, and in my experience at least once led to the production of 4x our total raise goal from a single platform-newsletter feature of our campaign.
Sometimes people do want to be notified. Newsletters are something of a different issue, but the case above seems like a newsletter to me. Especially because we used our first campaign backers + second campaign + interest landing pages and social media gathering email campaigns to continually send emails about new campaigns and products.
Essentially, therefore, I'm arguing that the ability to gather a quality, targeted email list and generate a recurring newsletter without a 10%+ attrition rate is both difficult and valuable.
CAN-SPAM compliance requires an unsubscribe link; my personal interpretation is that 1-click unsubscribe should be the rule, with no loading of email settings pages behind login walls. Good design is honest. Crowdfunding that requires physical-goods production in quantity is very difficult for the uninitiated. And then it remains difficult and time-consuming, and requires constant attention. This is essentially a scaling issue, but in the physical world. So many of the failed-to-deliver crowdfunded projects are not so much dishonest as naive. But also consider Jobs' thoughts on the subject:
> great artists ship
though Dieter Rams (the most famous living industrial designer) says
> designers are not fine artists who we are often confused for
The extension would delete that DOM subtree right away, plus some kind of cloud harvesting from users reporting false positives.
In general, I agree. My reaction to most modals is to simply close the tab. Often it's halfway through an article. I can't be bothered to finish it if I'm being interrupted rudely.
I don't know what to do about this situation other than to write my own paste-in package for newsletter signups, which I don't really have time for. I guess the best thing I can do is announce: if your newsletter prompt doesn't cover the main content of the page, I'm much more likely to subscribe (~20%) than if it's a modal + windowshade (0%).
Sure there is:
1. Alt-F4
2. Ctrl-Alt-Del
They propose a gradual distrust of existing certificates by reducing the 'maximum age' of the certificates with each release of Chrome.
EV certificates are proposed to have their EV indicators stripped immediately, until Symantec demonstrates sustained compliance for one year.
In case you missed it at the bottom:
> From Mozilla Firefox's Telemetry, we know that Symantec-issued certificates are responsible for 42% of certificate validations
For the vast majority of users that's probably just fine, but I would have thought that there'd be a browser or extension or something that allows security-conscious power users more fine-grained control over this by now.
For example, I could subscribe to changes in CA trust levels from every major browser vendor, and if they don't agree my browser could show me a warning with an explanation.
Or I could subscribe to feeds from other entities I trust, like the EFF. Or my security-conscious friends.
Or if I decide I have lower trust in certificates issued by governmental CAs, or CAs in certain regions, I could mark them as lower trust.
Basically a web of trust for CAs.
(Archive link: http://archive.is/Cq9VO )
Really interesting reading!
sorry if I missed this...
Chrome 59 (Apr 13, 2017) +1023 days: 2020-01-31
Chrome 60 (May 25th, 2017) +837 days: 2019-09-09
Chrome 61 (Jul 20th, 2017) +651 days: 2019-05-02
Chrome 62 (Aug 31st, 2017) +465 days: 2018-12-09
Chrome 63 (Oct 12th, 2017) +279 days: 2018-07-18
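The dates listed above are consistent with a simple linear schedule: the maximum accepted certificate age drops by 186 days per release, starting from 1023 days in Chrome 59. A quick check, derived from those numbers rather than from the proposal text itself:

```go
package main

import "fmt"

// maxAgeDays reproduces the per-release schedule implied by the list
// above: 1023 days in Chrome 59, shrinking by 186 days each release.
func maxAgeDays(chromeVersion int) int {
	return 1023 - 186*(chromeVersion-59)
}

func main() {
	for v := 59; v <= 63; v++ {
		fmt.Printf("Chrome %d: %d days\n", v, maxAgeDays(v))
	}
}
```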
>While this list may need to be updated for some recently created roots, https://chromium.googlesource.com/chromium/src/+/master/net/... may accurately capture the state of impact
Damn. There goes my certificate (RapidSSL). Does anybody know which trustworthy certificate issuers remain?
No, we cannot use Let's Encrypt, for convenience reasons (we bake our certificate's public key into many places).
When I read that something like this popped up in my head:
"Google is using the nuclear option on Symantec. Neat!"
This seems to be indicative of the general direction Chrome wants to head in anyway.
 https://cabforum.org/pipermail/public/2017-February/009746.h... - there was a more explicit post elsewhere but I can't find it in the archives right now
Is this for all certs or just Symantec certs?
Don't fail the connection, so people will still be able to use the site, but don't present it as secure.
This would be a stronger action than treating EV certs as non-EV, which only a few geeks will notice. Or reducing the maximum age of certificates.
But that's almost 2 years old. Have there been any more recent incidents that I'm unaware of?
> Combined with the gradual move to certificates with shorter lifespans anyway (as a way of coping with problems like this, and with the difficulty of certificate revocation generally), automation is a necessity going forward.
Interesting. What are the effects of automation? E.g. is it possible to automate the update of EV certs? Is this opinion fueled by the idea that websites should be hosted in the public cloud? What is OWASP's stance?
Where can I find more on pro & con automated cert renewal? Around the time when Letsencrypt was introduced, there must have been someone who wrote an informed article/blog on automated cert renewal.
They're big enough to afford the higher end law firms likely needed too. :(
32 million = 0.1%
32,000 million SSL certs = 100%?
One of the arguments that directly comes to mind when I look at dragapp's screenshots is that parsing messages to find out what the underlying task is just takes too much time. Look at the image and you'll see that there is no way to see what part of "Chrome extension for Gmail" is in to-do, progress or complete. You have to read each e-mail to find out.
Backup screenshot: http://imgur.com/a/VtyyO
When you think about it, gmail is just one input into your list of things to do for the day - so you could imagine other integrations like Github PRs that end up there. Anyway, I'm glad people are still working on this problem, I consider it very much unsolved.
Perhaps move to a host that helps you with that if you are unsure how to deploy HTTPS?
and selected all to delete.
No more annoying count of unread emails from years ago.
Email != ToDo List
GTD has been working for me for 10 years now, with simple lists. I think part of its strength is that the GTD framework includes a method to "start over again" after your system inevitably gets cluttered.
That doesn't make an iPhone 7 impregnable, but it should inform any analysis you do of stories about phones being tampered with "starting in 2008"; that's a little like talking about SMTP server security "starting in 1993".
Someone might have sat on a copy for years before leaking.
Edit: Quick scan shows there are some docs with dates in 2013, 2014, 2015. So at least some of it is fairly recent. No real way to tell, though, if it was all pulled at once, assembled over time, etc.
"fixed" probably isn't the right word.
Regardless of the fact that this is a patched, nearly decade-old exploit, they're trying to make a scene rather than go through ethical channels.
Direct GitHub link: https://github.com/google/seq2seq
When you feel yourself getting all rigid and tense in the muscles, say, because you read an article about how you're doing it wrong or that your favourite libraries are dead-ends, just take a deep breath and patiently allow yourself to return to your gelatinous form.
Now I know what you're thinking: "that's good and all, but I'll just slowly become an obsolete blob of goo in an over-priced, surprisingly uncomfortable, but good looking office chair. I like money, but at my company they don't pay the non-performing goo-balls." Which is an understandable concern, but before we address it, notice how your butt no longer feels half sore, half numb when in goo form, and how nice that kind of is. Ever wonder what that third lever under your chair does? Now's a perfect time to find out!
As long as you accept that you're always going to be doing it wrong, that there's always a newer library, and that your code will never scale infinitely on the first try, you'll find that you can succeed and remain gelatinous. Pick a stack, then put on the blinders until it's time to refactor/rebuild for the next order of magnitude of scaling, or the next project.
The logical conclusion then is to wait a year and ignore everything this article says. Saves a lot of time. :)
For anyone interested in learning React, here's my standard advice:
You should start out by reading through the official React docs and tutorial at https://facebook.github.io/react/, and use the official Create-React-App tool ( https://github.com/facebookincubator/create-react-app ) for setting up projects. It creates a project with a solid build setup, with no configuration needed on your part.
Past that, I keep a big list of links to high-quality tutorials and articles on React, Redux, and related topics, at https://github.com/markerikson/react-redux-links . Specifically intended to be a great starting point for anyone trying to learn the ecosystem, as well as a solid source of good info on more advanced topics.
OK, so everything's changing all the time, everything you Google will be out of date, but it's worth it because... why, exactly?
Node doesn't block on IO. OK, cool. But the Erlang VM doesn't block on IO, either, AND it's not single threaded, so you can actually do parallel computation if you need to.
JS has a lot of raw speed these days. OK, cool. But raw speed is generally less important than good algorithms and parallelism, and if we really need raw speed in other languages, we can generally call out to C or something.
Rust has a unique value proposition: safety and speed. Erlang/Elixir have one: supervision and easy parallelism. Ruby has one: well-worn tools for web development.
In a typical React app (+TypeScript), I have at least 40 dependencies in package.json (frontend-only). I know exactly what each dependency is required for. And each time I start a new app, I create that package.json one line at a time - just to make sure I don't add unnecessary dependencies.
But for a JS newbie, knowing which dependency is which and why is pretty overwhelming, tedious and uncomfortable. This is assuming they already spent time learning JS and the frameworks in the first place. It's easier to give up on that important learning when you have to deliver "just a landing page in Angular" and answer to managers. After a few iterations, the result is a disaster that is difficult to maintain, buggy, and much harder to add features to. Ultimately that adds to the development/maintenance cost.
I know why the JS tooling is complex, but sometimes I wish JS projects were as simple (and controlled) as creating C# Windows apps in Visual Studio - where the tooling is mostly hidden.
Does not bode well. Has nothing to do with language though.
The JS everywhere is so much like the "only tool you have is a hammer, so every problem looks like a nail" thing, it's amazing.
Creating a simple, secure, extensible middle-tier is a solved problem and is not in need of JS trying to solve it in a much more obtuse way. I've created many myself in everything from Delphi, PHP, C#, Groovy, to Java and I would never pick JS for that layer.
And a final thought, PHP used to get tons of bad press for being messy, etc, etc. But this JS stuff takes that mess to a whole new level. Perhaps PHP devs moved to node/js so they could make a mess and everyone would still think they are the cool kids?
At some point you have to ask yourself how much of this is good engineering (does it help the end user?) and how much it's just having fun and impressing fellow engineers. Is the complicated build process worthwhile just to obtain a nicer syntax (for the current definition of "nicer")?
The whole thing looks like it was designed for building big, long-term-maintained websites, not for helping you hack up a throwaway site quickly.
Stuff like gulp: after I installed it globally, I still have to install it per project, which makes little sense to me.
It used to be just "compass init; compass watch", then start writing stuff, uglify the JS before deploy, and done.
I still hate JS; I don't get why a language once hated by everyone is now sexy again. PHP, in contrast, has improved a lot since 5.3, and it's actually quite fun to write these days.
Rediscovering the horrors of setting up a Java web app in the early 2000s, except now you configure it with json instead of xml.
In Django I would just create a model, a form class, and a template, and that would handle the rendering of the form, form validation, and saving the data into a DB. But with React you have to create the components and also build a separate API that handles the retrieval and saving of data.
Are there any ancient developers out there building web apps faster with react/vuejs/angular than django/rails?
1. Python + functional programming in Python
Python is hardly a pure functional language, but it's lovely and simple and has all the core concepts including list comprehensions. This leads you on to...
If you want to find a pure functional solution to a Python problem, first search for the Haskell one and translate it. Then read Learn You a Haskell, which was the funniest programming book I ever read and almost, almost taught me about monads (I had it for a second, then tried to explain it in Python and all was lost).
Now you can relax cause the hard bit is done.
4. Work your way through the funfunfunction videos, especially the functional playlist, and as an added bonus he has videos where he works through the first few chapters of Learn You a Haskell.
Then you've got map, reduce, filter all completely under control. Now immutability makes more sense, arrow functions don't look so strange, promises are just friendly monads really and we all love those.
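As a quick refresher on those three workhorses in plain JavaScript (the array and numbers below are made up for illustration):

```javascript
const nums = [1, 2, 3, 4, 5];

// map: transform each element
const doubled = nums.map(n => n * 2);            // [2, 4, 6, 8, 10]

// filter: keep only elements matching a predicate
const evens = nums.filter(n => n % 2 === 0);     // [2, 4]

// reduce: fold the whole array down to a single value
const sum = nums.reduce((acc, n) => acc + n, 0); // 15
```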
Now Immutable.js, lodash, and underscore are all reasonable to understand.
React's moaning about state and pure functions makes reasonable sense.
5. Babel really isn't that hard. Just following the Meteor + React tutorial got that all working without me really noticing. Then, holy moly, you're all reacted up, with JSX and pure, sweet-smelling functions.
6. Follow some of Dan Abramov's excellent blog posts, such as the one about getting eslint up and running in Sublime Text.
Yeah that's as far as I've got, but adding in Redux to this mix doesn't seem so scary, at least I understand the language now. Angular will just have to wait.
Links:
- Learn You a Haskell: http://learnyouahaskell.com/
- funfunfunction: https://www.youtube.com/channel/UCO1cgjhGzsSYb1rsB4bFe4Q/
- functional playlist: https://www.youtube.com/playlist?list=PL0zVEGEvSaeEd9hlmCXrk5yUyqUag-n84
- Learn You a Haskell videos: https://www.youtube.com/watch?v=I7IdS-PbEgI
- eslint in Sublime Text: https://medium.com/@dan_abramov/lint-like-it-s-2015-6987d44c5b48
- Meteor + React tutorial: https://www.meteor.com/tutorials/react/creating-an-app
...in order to replicate dispatchEvent and addEventListener?
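For reference, the pattern being replicated is tiny in plain JavaScript. A minimal sketch (names mirror the DOM API for familiarity, but this is illustrative, not a drop-in replacement):

```javascript
// Minimal publish/subscribe sketch shaped like the DOM's
// addEventListener/dispatchEvent pair.
class Emitter {
  constructor() {
    this.listeners = {}; // event type -> array of callbacks
  }
  addEventListener(type, fn) {
    (this.listeners[type] = this.listeners[type] || []).push(fn);
  }
  dispatchEvent(type, detail) {
    (this.listeners[type] || []).forEach(fn => fn(detail));
  }
}

// Usage: a simple event bus
const bus = new Emitter();
bus.addEventListener('save', detail => console.log('saved', detail));
bus.dispatchEvent('save', { id: 1 }); // logs: saved { id: 1 }
```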
To all the haters: there are very practical reasons I started using Node on the backend years ago. Trying to build a collaborative real-time web application with Twisted and whatever was a huge pain. With something like Node.js it's much more practical because you get async I/O and an isomorphic setup (same codebase on client and server).
And now we are actually one or two trends removed from that initial wave of popularity. Plenty of the young guys who basically wrote the book on Node.js and created the most downloaded-ever npm modules moved on to Go or whatever years ago. And plenty of them decided to move on to Rust (or a few picked up Nim if they are smart).
I have to admit it sure does sound popular these days. But I still am under the impression (delusion?) that a dedicated backend implementation in a better-suited-language is what I really need. That said, I have little experience with web dev.
A lot of the time, when technology is being created in a space where existing technologies already solve the problem - and even more so when these frameworks are an "extension" of such platforms - the NIH syndrome stems from being unfamiliar with how the original technology solves the problem. Rather than RTFM and internalizing that way of thinking, the libraries get rebuilt. Even worse, some get adopted as standard by new developers who don't know better.
jQuery, motherfucker, I'm looking at you. HTML5 and CSS can go further than many expect, including myself.
If you just take your time, carefully go through tutorials and study a couple of open source projects you'll be fine. Especially if you already have years of development experience. You can do it!
Pretty opinionated on the tooling it uses, but good if you just want to say "Fuck it, I want to fork some boilerplate, write some code, and build/distribute it", then worry about the finer details later, and swap in specific tooling that you need later.
I'm kind of still waiting for someone to build, like, an Erlang (or Elixir) for the frontend. Maybe compile actual Erlang to it via emscripten.
Then code up a way to interface with the DOM, as well as model it in tests, and I could be a frontend developer again! (except for CSS... hrm)
I'd be happy to never look at JS again, frankly.
I think Elm currently comes closest to what I'm looking for
Pardon the shameless plug, but the whole project started as an easy way to move from jQuery spaghetti to a fast data-driven virtual-dom view layer, so it seems fitting to post here.
So I feel like some of the concepts taught in the most recommended books, such as Clean Code, are only tangentially relevant and sometimes just plain irrelevant. On the other hand, massive books/material written for pure JS are really low quality compared to these classics (except some blog posts, but with those it's difficult to get a rounded education).
So the most difficult part for me was finding out which concepts in the high-quality books were relevant for JS and which were not - something that is not easy when you don't know these concepts yet. Then learning the things they don't mention that are important for a modern JS developer, such as NPM, exporting modules, minification + gzip + tree shaking, etc.
"I had to let go of doing it The Right Way from the get-go, and allow myself to fumble through using suboptimal or just plain amateur setups just to get comfortable with individual tools"
Hopefully 5 years from now, we'll all be looking back in disbelief at what we used to put up with to build web apps.
Simple browser = script src=x
Simple node = index.js
What if you want some libraries included? Specify your libs in package.json and npm install.
What if you want to publish your libs? Npm publish.
What if you want to have dependency tree bundled for browser as single script? Webpack
What if you want nice types, latest features and validation for larger projects? TypeScript loader in webpack.
What if you want a nice state management library? Preact/React.
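For illustration, the webpack + TypeScript step in the list above might be wired up roughly like this (file names and paths are assumptions; check the ts-loader docs for current options):

```javascript
// webpack.config.js - minimal sketch, assuming typescript and
// ts-loader are installed as devDependencies.
module.exports = {
  entry: './src/index.ts',
  resolve: {
    // Resolve both TypeScript and plain JS imports
    extensions: ['.ts', '.js'],
  },
  module: {
    rules: [
      // Hand every .ts file to the TypeScript compiler via ts-loader
      { test: /\.ts$/, use: 'ts-loader', exclude: /node_modules/ },
    ],
  },
  output: { filename: 'bundle.js' },
};
```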
There's a ton of crap and different ways of doing things, but I bet the fundamental pieces that solve a large % of use cases are here to stay.
I would say things are getting simpler in some manner. You don't really need lodash and jquery nowadays. The standard js in browsers has improved quite a bit.
I've been using React with JSX and maybe as little as React Router on top. Trying to get the router added was so painful: NPM, Babel, webpack, grunt, gulp, etc., and then you find you've been given instructions for an old React version and you need different tooling for the new one. Or the instructions give you something that's really only suitable in dev, not production.
I actually gave up trying to add Flux and wrote my own implementation in vanilla JS, which seems ludicrous to me.
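A vanilla-JS Flux-style store really can be tiny, which may be why rolling your own is tempting. A sketch of the core idea (all names illustrative; this is not the Flux library's API):

```javascript
// Minimal Flux-ish store: dispatched actions flow through a single
// reducer-like function, and subscribers are notified of new state.
function createStore(reduce, initialState) {
  let state = initialState;
  const subscribers = [];
  return {
    getState: () => state,
    dispatch(action) {
      state = reduce(state, action);
      subscribers.forEach(fn => fn(state));
    },
    subscribe(fn) {
      subscribers.push(fn);
    },
  };
}

// Usage: a counter store
const store = createStore(
  (state, action) => (action.type === 'INCREMENT' ? state + 1 : state),
  0
);
store.subscribe(s => console.log('state is now', s));
store.dispatch({ type: 'INCREMENT' }); // logs: state is now 1
```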
FTFY: Zero Problems, Not-Tested Solutions
Why build complicated native applications when a thinly wrapped SPA in Electron will be indistinguishable to the customer? The only objection here is the engineer's ego and reversion to the familiar.
> It is tempting to say that the record low we are seeing this year is global warming finally catching up with Antarctica, Meier said. However, this might just be an extreme case of pushing the envelope of year-to-year variability. We'll need to have several more years of data to be able to say there has been a significant change in the trend.
I saw Fox News manage to put an anti-global-warming spin on it by only reporting that cold killed the cherry blossoms, leaving out the fact that they normally blossom in April.
What are we going to do about it?
I hear people say it's just too late to stop a catastrophe, but I find it hard to believe.
Concretely: Contributing to apache 2 software potentially grants universal licenses for relevant patents you hold to anyone that uses the apache 2 software. Courts have not decided how viral this is. (What if I start with apache 2 code, make substantial changes, and apply it in areas the patent holder objects to, for example?)
I'm not a fan of software patents, but sloppy retroactive relicensing like this will create all sorts of legal ambiguity for users and contributors.
(Also, as the openbsd thread points out, apache 2 is license incompatible with existing openssl forks and other downstream software)
Having helped out with many relicensing exercises at this point, I'm not sure why people are so worried.
The usual tack taken is: if you cannot affirmatively get consent, you remove the code and, if still needed, have someone not involved rewrite it.
I have yet to see anything in all of this that actually says they plan on doing anything different (i.e. Theo's trolling seems not to be based on reality). I looked at the press releases, details, and mailing lists. What am I missing?
EDIT: Dammit, Squid is GPLv2. It might make the situation worse due to the patent clause.
So, potentially now.. GPLv2 programmes can't use OpenSSL. Nice.
I don't suppose the LibreSSL folks could end up creating a 2-clause BSD implementation of the SSL/TLS stack that could divorce itself completely from its OpenSSL origins? (Yes, I know that's almost certainly impractical for such a large, complex and thorny code base and problem set, but it's a nice dream... maybe some cryptography researchers/companies/etc. looking to make a name in the industry could target and re-implement specific pieces/algorithms/etc.)
After all, this spells doomsday for the upcoming generations, so shouldn't it be the news that gets shown/covered almost every day on the front page?
The people have the right to know that their children and grandchildren will suffer because of something that is going on right now. I guess the majority of people all over the world are blissfully unaware of this scenario because it doesn't get the kind of attention in the MSM that it should. All they get served is dirty politics and gossip entertainment news.
Maybe people will force the policies to change if they get to know that this will happen.
It seems that most people today think that Terrorism is the main threat to our society, when in fact, Global Warming seems to be the real deal.
Let's say it was found that fifty years from now, an Asteroid would hit Earth. Would the people of Earth react in the same way as they are doing now?
Some summary points:
- Total amount of methane in the current atmosphere: ~5 gigatons
- Amount of carbon preserved as methane in the arctic shelf: estimated at 100s-1000s of gigatons
- Only 1% release would double the atmospheric burden of methane
- Not much effort is needed to destabilize this 1%
- The volume currently being released is estimated at 50 gigatons (it could be far more)
- 50 gigatons is 10x the methane content of the current atmosphere
- We are already at 2.5x the pre-industrial level; there is a methane veil spreading southward from the Arctic.
- Methane is 150x as powerful a greenhouse gas as CO2 when it is first released.
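Taking the summary's figures at face value (they are the video's claims, not independently verified), the arithmetic hangs together like this:

```javascript
// Sanity-checking the figures quoted above.
const atmosphereGt = 5;      // methane currently in the atmosphere (Gt)
const shelfEstimateGt = 500; // a mid-range shelf estimate (100s-1000s Gt)

// A 1% release of a ~500 Gt reservoir is 5 Gt, which would indeed
// double the current atmospheric burden.
const onePercent = shelfEstimateGt / 100;
console.log(onePercent === atmosphereGt); // true

// A 50 Gt release is 10x the current atmospheric content.
const releaseGt = 50;
console.log(releaseGt / atmosphereGt);    // 10
```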
Here is a longer video for those who have the time: https://www.youtube.com/watch?v=FPdc75epOEw
If anything this article plays it down with words like "eruption" and "venting". This seems super dangerous for people in the area, and dangerous in the climate-change sense for the rest of us.
Scientists have been warning about this for decades, and we've done nothing. We haven't even slowed down!
I saw the best minds of my generation destroyed by madness...
Digitize all human knowledge and as much art/literature as you can gather (books, music, movies, shows, games, even porn and random YouTube videos and discussions on online forums :) Most of that work has already been done.
Store it on the most resilient (and simple/repairable) storage media you can,
Bundle it with devices that can read that data,
Along with instructions for building/reinventing such devices, and instructions on how to interpret that data (i.e. JPEG and other file formats :)
Also include a guide for translating the instructions. Assume that a future reader may not understand any of our current languages, or even be human at all.
Put it all in a silo as physically strong as you can build.
Make copies of the silo and bury one on each continent and in each ocean. Maybe even on the Moon?
Distribute markers and maps to each silo (and instructions for opening them) all over the world.
Let fate take its course.
All of this could be done by a few individuals and most of it won't even require a lot of money.
When the backwards people see us all pass them on the highway in our cheap-to-run, fast-to-accelerate and cheap-to-maintain electrics, will they feel dumb buying gallons of gas to keep their guzzlers going?
Perhaps the way things are going they will become too plentiful to really do anything realistic about it.
- I assume the coasts are a nonstarter.
- Moving inland could be nice, but the ground would need to be pretty arable to restart civilization there. I'm guessing somewhere in Africa is a safe bet, given the lack of resource exploitation there, but then again, there's not much infrastructure at the moment.
Basically, either we make all the turnarounds necessary, on every front - policy, technology, engineering, culture - to beat this stuff by a huge margin, or it's all over. I don't see a middle ground of "things kind of suck for a while and then it's okay" happening. And that guides a lot of my thoughts on other topics: I want survivability, and of a form which preserves key freedoms. I believe massive reforms to the economic and political system are needed to do it. I believe our existing social network structures and cultural norms are insufficient to address this. There's a lot of room to change in all directions.
edit: nm, I saw the conference where they showed methane releasing from the submerged Siberian plain. We're fucked.
(The craters filled with water, hence the dark color. The picture doesn't say anything about the craters' depth)
I don't know about the source though, so I guess it should be taken with a grain of salt.