I've started a video series on antiX 15. The playlist is here:
https://www.youtube.com/playlist?list=PLTRkAa6x1htWRH6HRBlumycA-d9akkjfZ
Individual videos:
antiX 15 - What's New
https://www.youtube.com/watch?v=JRO-meyYGWg
antiX 15 - Personal Menu & MenuManager
https://www.youtube.com/watch?v=wwuTtjF6zS0
antiX 15 - Streamlight
topic title: antiX 15 Videos
Posts: 2,238 • Joined: 16 Dec 2007

Posts: 4,164 • Joined: 20 Feb 2009
#2
Yay! You da man D.O. as usual. From a subscriber.

Posts: 1,445 • Joined: 09 Feb 2012
#3
d_o, you mentioned in the "What's New" video that you haven't used FTP in recent years, so here are some quick points FYI.
Just as PuTTY is handy for keeping a database of connection details and credentials for various destinations,
most FTP client apps keep a similar "little black book" of those details, so you can simply click and connect
instead of manually filling in auth fields at the start of each session.
Some of the per-destination details saved and recalled with each "little black book" entry are real timesavers.
For instance, upon connection, your specified (or, optionally, last-used) local and remote directories are selected automatically.
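As a sketch of that "little black book" idea, here's what the same pattern looks like with Python's stdlib ftplib. The host, credentials, and directories below are made-up placeholders for illustration, not real servers:

```python
from ftplib import FTP

# Hypothetical address book of saved sessions, in the spirit of
# PuTTY's saved sessions or an FTP client's bookmarks.
# All entries here are placeholders, not real servers.
SITES = {
    "mirror": {
        "host": "ftp.example.org",
        "user": "anonymous",
        "passwd": "guest@example.org",
        "remote_dir": "/pub",
    },
}

def connect(name):
    """Click-and-connect: look up a saved entry, open the session,
    and land directly in the saved remote directory."""
    entry = SITES[name]
    ftp = FTP(entry["host"])
    ftp.login(entry["user"], entry["passwd"])
    ftp.cwd(entry["remote_dir"])   # auto-select the saved remote dir
    return ftp
```

A real client stores the same fields (plus the saved local directory) on disk rather than in a dict, but the click-and-connect flow is the same.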
FTP transfer is noticeably faster than HTTP/S, with less protocol overhead.
Twice as fast? Ten times? I don't have stats to quote off the cuff, but if you're downloading large files or bulk-transferring many small ones,
and the remote host offers both HTTP and anonymous FTP access, choose FTP (even in the browser) and you'll probably notice a dramatic
speed increase over previous HTTP downloads from that same host.
Downloads via FTP are also more "reliable", especially when you're moving myriad small files. Even though webserver and client can negotiate
"octet-stream" encoding for selected files where appropriate, too often misapplied content-type headers (dependent on webserver configuration)
leave you chasing down "broken" binary files which have been truncated.
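To illustrate the binary-mode point: with Python's stdlib ftplib, a RETR via `retrbinary` switches the connection to binary (TYPE I) mode and transfers the file byte-for-byte, with no content-type negotiation involved. The file names here are hypothetical:

```python
from ftplib import FTP

def fetch_binary(ftp, remote_name, local_name):
    """Retrieve remote_name in binary (TYPE I) mode and write it to
    local_name byte-for-byte; retrbinary sets binary mode itself, so
    no content-type guessing can mangle or truncate the file."""
    with open(local_name, "wb") as out:
        ftp.retrbinary(f"RETR {remote_name}", out.write)
```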
"FTP" is the generic term. Nowadays, aside from anonymous FTP intended for websurfing (in-browser) downloaders, most connections are sFTP.
On shared hosting, many (probably most) hosting providers outright disable/disallow port 21 globally, forcing use of port 22 sFTP instead.
Similarly, many (probably most) webhosting providers are reluctant to offer SSH connectivity.
So, from a webdeveloper perspective at least, a desktop FTP client is still (nowadays) an indispensible tool.
Just as"putty" is handy for keeping a database of connection details/auths for various destinations,
most FTP client apps similarly keep a"l'il black book" containing those details so that you can simply click-and-connect
vs manually fiddling with filling in auth fields during each (connection) session start.
Some of the per-destination details which are saved/recalled per each"l'il black book" entry are real timesavers.
For instance, upon connection, your specified (or optionally, last-used) local and remote directories are autoselected.
FTP transfer is noticeably faster than, has less overhead compared to, HTTP/S protocol.
Twice as fast? Ten times as fast? I don't have off-the-cuff stats to quote, but if you're downloading large file(s) or bulk transferring small files
and the remote offers both http and anonymous ftp access, choose ftp (in browser) and you'll probably notice a dramatic speed increase compared to
what you had expected, compared to previous http downloads from that same remote destination.
Downloads via FTP are more"reliable", especially when you're moving myriad small files. Even though webserver/client can negotiate"octet-stream" encoding
for select files, where appropriate... too often, misapplied content-type headers (dependant on webserver configuration) leave you chasing down"broken"
binary files which have been truncated.
"FTP" is the generic term. Nowadays, aside from anonymous FTP intended for websurfing (in-browser) downloaders, most connections are sFTP.
On shared hosting, many (probably most) hosting providers outright disable/disallow port 21 globally, forcing use of port 22 sFTP instead.
Similarly, many (probably most) webhosting providers are reluctant to offer SSH connectivity.
So, from a webdeveloper perspective at least, a desktop FTP client is still (nowadays) an indispensible tool.
Posts: 850 • Joined: 26 Jul 2012
#4
I tend to use wget -c for grabbing files from the web, when I can.
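For anyone curious what `wget -c` does under the hood, here's a rough Python sketch of the same resume-from-offset idea using only the stdlib. It's simplified and assumes the server honours HTTP Range requests:

```python
import os
import urllib.request

def resume_offset(dest):
    """Byte offset to resume from: size of any partial file, else 0."""
    return os.path.getsize(dest) if os.path.exists(dest) else 0

def resume_download(url, dest, chunk_size=64 * 1024):
    """Rough analogue of `wget -c`: if a partial file exists, ask the
    server for only the remaining bytes via a Range header, then
    append them to the partial file."""
    start = resume_offset(dest)
    req = urllib.request.Request(url)
    if start:
        req.add_header("Range", f"bytes={start}-")
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        while chunk := resp.read(chunk_size):
            out.write(chunk)
```

If the server ignores the Range header it will send the whole file again, which `wget -c` also has to cope with; this sketch skips that check for brevity.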
Posts: 2,238 • Joined: 16 Dec 2007
#5
@fatmac - thanks for the ftp primer. used to use it a lot back in the days before dropbox and similar services to move files between engineering outfits.