Time for a period of degrowth, the only way we are going to deal with depleting natural resources and global warming.


Probably carbs from veg and beans... not Wotsits


That guy at the end who changes into a vest is lols


Golf


China seems quite far ahead with their environmental policies. If only we also experienced the same smog, to push our politicians into taking environmental policy more seriously.


We did (to an extent), which is why we implemented the environmental laws we now have. But living so long without the severe negative environmental impacts leads us to slacken our resolve, because it's easy to think we got to this point for free.


We did. And that’s why we do.


The question should be: is America a developed country?


Haha, I felt disturbed by the title and conjured up the same counter-question. When has the word "development" been more contested? I think we (or at least I) don't want to see it used anymore in the sense of "short-term economics that don't account for common well-being or ecology".


How long is this article????


Just under 7,000 words, the first three of which are "The long read".

This identifies the piece as long-form journalism, even without reading the text. Quoting https://en.wikipedia.org/wiki/Long-form_journalism:

> Long-form journalism is a branch of journalism dedicated to longer articles with larger amounts of content.[1] Typically this will be between 1,000 and 20,000 words.


It's literally called "The long read"; you can't really be surprised.


I'm using Instagram instead...


beautifulsoup
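
For anyone new to it, a minimal sketch of basic usage (the URL and CSS selector here are placeholders, not from any real site):

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical listing page; swap in the real URL you want to scrape.
    resp = requests.get("https://example.com/listings", timeout=10)
    resp.raise_for_status()

    soup = BeautifulSoup(resp.text, "html.parser")
    # ".listing-title" is a placeholder; inspect the page for the real selector.
    for title in soup.select(".listing-title"):
        print(title.get_text(strip=True))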


Also good is RoboBrowser, which combines BeautifulSoup with Requests to give a nice 'Browser' abstraction. It also has good built-in functionality for filling in forms.
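
A quick sketch of the form-filling flow (the form id and field name are made up for the example):

    from robobrowser import RoboBrowser

    browser = RoboBrowser(parser="html.parser")
    browser.open("https://example.com/search")  # hypothetical page with a search form

    # "search-form" and "q" are placeholders; inspect the page for the real names.
    form = browser.get_form(id="search-form")
    form["q"].value = "raspberry pi zero"
    browser.submit_form(form)

    # After submission, browser.select() queries the result page via BeautifulSoup.
    for link in browser.select("a.result"):
        print(link.text, link.get("href"))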


Using this as well, with Requests, to automate eBay/Gumtree/Craigslist. Works very well.


Any details on this anywhere, or is it not for public consumption? I'm just getting started in Python and want to do something with Gumtree and eBay as an idea to help me in a different sphere.


It's not really for public consumption because it's embarrassingly badly written :)

It's pretty dumb really. I just figured out the search URLs and then parse the list responses. It stores the auction/ad IDs it has seen in a tiny redis instance, with a 60-day expiry on each ID it inserts. If there are any items it hasn't seen each time it runs, it compiles them into a list and emails them to me via AWS SNS. It runs every 5 minutes from cron on a Raspberry Pi Zero, plugged into the back of my Xbox 360 as a power supply and into my router via a USB/Ethernet cable.
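
Roughly, the dedup-and-notify loop looks like this (a simplified sketch, not the real code; the URL, selector, and topic ARN are placeholders):

    import requests
    import redis
    import boto3
    from bs4 import BeautifulSoup

    r = redis.Redis()  # the tiny local redis instance
    sns = boto3.client("sns")
    SIXTY_DAYS = 60 * 24 * 60 * 60

    def check_search(url):
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        new_items = []
        # "[data-listing-id]" is a placeholder; real selectors depend on the site.
        for ad in soup.select("[data-listing-id]"):
            ad_id = ad["data-listing-id"]
            # set(..., nx=True) only succeeds for IDs not seen before;
            # ex= gives each ID the 60-day expiry.
            if r.set(f"seen:{ad_id}", 1, nx=True, ex=SIXTY_DAYS):
                new_items.append(ad.get_text(strip=True))
        return new_items

    items = check_search("https://example.com/search?q=rasberry+pi")
    if items:
        sns.publish(
            TopicArn="arn:aws:sns:eu-west-1:123456789012:deals",  # placeholder ARN
            Subject="New listings",
            Message="\n".join(items),
        )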

The bulk of the work went into the searches to run, which are a huge list of typos on items with a high return. I tend to buy, test, then reship them for profit. Not much investment gives a very good return - it pays for the food bill every month :)
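
For illustration, one simple way to generate typo variants of a seed term (single-character deletions and adjacent transpositions; not necessarily how any real list was built):

    def typo_variants(term):
        """Generate simple misspellings of a search term:
        single-character deletions and adjacent transpositions."""
        variants = set()
        for i in range(len(term)):
            variants.add(term[:i] + term[i + 1:])  # deletion
        for i in range(len(term) - 1):
            # swap characters i and i+1
            variants.add(term[:i] + term[i + 1] + term[i] + term[i + 2:])
        variants.discard(term)
        return sorted(variants)

    print(typo_variants("nintendo"))
    # includes e.g. 'nintedno', 'nitendo', 'nintend'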


Thanks for the info - I'm sure mine will be of lower quality when I do write it. I'm hoping to compile real-world data on sold vehicles by scraping eBay and Gumtree, but that will take time and more skills than I currently possess. Good to hear someone's made something out of a similar idea, though.


Sounds like a good idea. Good luck - you can do it! :)


Street cred

