Downloads a file via FTP or HTTP. wget is invaluable for retrieving files
directly to the slug rather than downloading to a PC and then uploading to the
slug. It's also a handy way to get files from a server on the same network,
provided the server supports HTTP or anonymous FTP.
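Typical invocations look like this (the URLs are placeholders; substitute the
address of the file you actually want):

```shell
# Fetch a file directly into the slug's current directory
wget http://example.com/packages/foo.tar.gz

# -c resumes a partial download after an interruption
wget -c http://example.com/packages/foo.tar.gz

# Anonymous FTP works the same way
wget ftp://example.com/pub/foo.tar.gz
```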
"screen" allows you to disconnect from the slug and leave programs
running, then reconnect and pick up the session at a later time. It also
provides the ability to manage several sessions over one connection.
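A minimal session might look like this (Ctrl-a d is the default detach
keystroke):

```shell
screen -ls    # list any existing sessions
screen        # start a new session; run your job, then press Ctrl-a d to detach
screen -r     # later (even over a new connection): reattach and pick up the session
```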
Produces a detailed directory listing rather than just filenames.
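Assuming this entry refers to "ls", the long-format flag is -l:

```shell
ls -l     # permissions, owner, size and date for each entry
ls -la    # the same, but including hidden (dot) files
```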
Dumps the contents of a text file to standard output (usually your screen); a
convenient way to get text data onto the PC if you set your terminal program to
log everything. It's also useful in cron jobs.
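For example (the filenames here are made up for illustration):

```shell
# Make two small files, then join them; cat given several arguments
# simply dumps each in turn to standard output:
echo "first half"  > part1.txt
echo "second half" > part2.txt
cat part1.txt part2.txt               # prints both to the terminal
cat part1.txt part2.txt > whole.txt   # or redirect into a new file
```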
Probably short for "disk free"; reports on disk capacity and usage. Due to the
complex way drives are mounted the report can be tricky to interpret, e.g.
Filesystem 1k-blocks Used Available Use% Mounted on
/dev/sda1 6528 5524 1004 85% /initrd
/dev/sda1 482214 99746 357569 22% /
/dev/sda1 482214 99746 357569 22% /dev/.static/dev
tmpfs 10240 32 10208 0% /dev
tmpfs 15148 3380 11768 22% /media/ram
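The listing above is the default 1k-block output; if your build of df supports
it, -h is often easier to read:

```shell
df       # capacity in 1k blocks, as in the listing above
df -h    # "human-readable" sizes (K, M, G)
```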
Lists running processes. This can be used to determine whether a service is
running or not, though again the output can be tricky to interpret.
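For example ("dropbear" is just an illustrative service name; the bracketed
first letter stops grep from matching its own command line). Note that the
slug's BusyBox "ps" lists every process by default, whereas desktop versions
may need "ps aux":

```shell
ps                                        # list running processes
ps | grep '[d]ropbear' || echo "dropbear is not running"
```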
Reads or sets the system (software) clock. This is volatile but should have been
set from the hardware clock when the system started.
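For example (setting the clock needs root, and the exact format -s accepts
varies between implementations, so check your "date" help text):

```shell
date       # show the current system time
date -u    # the same, in UTC
# Setting the clock, e.g.:
#   date -s "2006-01-02 15:04:05"
```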
Reads or sets the real-time (hardware) clock. This continues to operate when the
unit is powered down but probably keeps fairly poor time.
The hardware clock does not appear to have an alarm function and cannot be
used to wake the system at a scheduled time as far as I am aware.
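The common operations look like this (all of them need root, and -r and -s
need a working RTC device):

```shell
hwclock -r    # read and display the hardware clock
hwclock -w    # set the hardware clock from the system clock
hwclock -s    # set the system clock from the hardware clock
```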