In today's article, I manage to turn a single flag into an entire article!

KGIII

Super Moderator
Staff member
Gold Supporter
Joined
Jul 23, 2020
Messages
11,498
Reaction score
9,993
Credits
95,326
Yeah, it's not my finest hour, but I *do* share something useful - specifically how to resume an interrupted wget download.

For better or worse, this is pretty easy - though not something you'll need every day. When you do want it, it's pretty handy.


Yup... A whole article about adding "-c" to a wget command. I'm not sure if I'm getting better or if I'm getting worse!
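In short (with a placeholder URL standing in for whatever you were downloading), it looks something like this:

# the original download gets interrupted partway through
wget https://example.com/big-file.iso
# re-run the same command with -c and wget picks up where it left off
wget -c https://example.com/big-file.iso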

Don't forget that feedback stuff.
 


Nice article, I'll have to read the other articles about wget soon. I'm interested because in "Wicked Cool Shell Scripts" there's a lot of info on web scraping with bash. That's very interesting to me, but I haven't gotten very far in that book because I keep studying C and I keep getting distracted by ideas that pop into my head. Those articles should help a lot with using commands to access the internet. Curl sometimes works when wget doesn't.
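For example, I gather curl has its own resume option, so a rough equivalent of the article's "-c" trick (placeholder URL again) would be something like:

# -O saves the file under its remote name, -C - lets curl work out the resume offset itself
curl -O -C - https://example.com/big-file.iso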
 
there's a lot of info on web scraping with bash

Yeah, you can use wget for that. You can also use httrack. You have to muck around with the settings to scrape larger WordPress sites, but it's possible.

Also, if you're scraping, use '-nc' (no clobber) so that re-running a scrape skips files you've already downloaded instead of hammering the site all over again.
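As a rough sketch (placeholder URL, and the exact flags depend on the site), a polite recursive grab might look like:

# -r recurse, -np don't climb above the starting directory, -nc skip files already on disk,
# --wait pauses between requests so the server isn't hammered
wget -r -np -nc --wait=2 https://example.com/docs/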

I currently have it disabled as a part of an A/B test, but my site has defenses against that. If you try hammering on it to download it, it'll view you as a malicious bot (after a while) and stop sending you useful information.

It's not that I mind the scraping, it's that I mind spending money on bandwidth. That and there's a lot of malicious bots out there. It's also a step in the right direction to mitigate DDoS attacks.
 
Yeah, I agree that it's best to be bandwidth-considerate, and I don't want to download someone's whole website anyway, lol
 
I'm thinking about wrapping my article up into a .pdf and giving copies away to anyone who donates.

Then, a few months down the road, I can update it with the new content - and keep doing that every few months.

It seems like a good idea to me. I'll probably make it donation-optional, but my CDN is doubling their prices at the end of the month.
 
You mean all or most of your articles, right? Sounds like a good plan to me.
 

All of them except the 'meta' articles (I hope). I want to exclude those, but the plugin I'd be using isn't all that refined.
 
I'd consider donating, but if I don't, it's because I already feel overwhelmed by all the stuff I need to learn about computers. The labyrinthine aspect of it already feels pretty overwhelming, but I still do it as much as I can, despite the fact that I haven't solved the money question, because I like it.
 
Oh, it's all good. I don't actually expect folks to donate. Regardless, the site will remain online. 'Snot like I'm gonna run out of money. There are worse ways to spend a few bucks on a hobby.
 
This site is being quite heavily advertised. Is that really not even paying for itself?
 

It gets posted once every couple of days. Sometimes people cite links, which is nice.

But pay for itself? No, no... Not really... LOL That's okay, it'll get paid for regardless.

The ads on it are not very profitable. AdSense is not nearly as profitable as people think.
 
