I need to make an offline copy of a wiki, with the following requirements:
1. Nothing outside wiki.website.com is copied.
2. Only webpages, images, CSS, and JavaScript files are copied.
3. It must be rate-limited so the crawl doesn't put undue load on the site, e.g. a 10 to 30 second delay between each page fetch.
I have a vague sense that wget or curl is the right tool for the job, but I don't know where to go from there.
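From skimming the wget man page, here is a rough sketch of the kind of invocation I have in mind (untested; wiki.website.com stands in for the real host, and I'm not at all sure these are the right flags):

```
# Rough sketch of what I'm imagining, based on the wget man page:
#   --recursive --level=inf   follow links with no depth limit
#   --no-parent               don't climb above the start URL's path
#   --domains                 only follow links on this host
#   --page-requisites         also grab each page's images, CSS, and JS
#   --convert-links           rewrite links so the copy works offline
#   --adjust-extension        save pages with .html extensions
#   --wait=20 --random-wait   delay varies 0.5x-1.5x of 20s, i.e. 10-30s
wget --recursive --level=inf --no-parent \
     --domains=wiki.website.com \
     --page-requisites --convert-links --adjust-extension \
     --wait=20 --random-wait \
     https://wiki.website.com/
```

In particular, I don't know whether this also needs an --accept/--reject filter to exclude file types other than pages, images, CSS, and JS, or whether curl would be a better fit for any part of this.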