
  • Thunderbc

    silent member

    in reply to mereteycs's post #3098

    Well, I have some interesting experiences too. It didn't really want to accept Tomato (the shibby build went on, but I couldn't log in even after waiting an hour; admittedly I didn't push it), but the latest stock firmware also kept dropping off the 4G network. Same Huawei E3372 as the other guy. It worked perfectly while browsing, but as soon as a heavier download started, it disconnected. Of course the web interface claimed everything was fine and connected, but it wasn't... Lately it wouldn't even connect at all, even though the stick itself works. Maybe it will be good for something one day...
    Thanks for the tip, I'll go and buy a TP-Link too.

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #3100

    Do you even get a public IP address on mobile internet? As far as I know, providers love to NAT, "for the subscribers' security".
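
    One way to check is to compare the WAN address the router itself reports with the address seen from the outside. A rough sketch, run on the router over SSH (wan0_ipaddr is an assumed ASUS-style nvram key, and ifconfig.me is just one example of a what-is-my-IP service):

    # WAN IP as the router sees it (the exact nvram key may differ per firmware):
    nvram get wan0_ipaddr

    # Public IP as the outside world sees it (the BusyBox wget is enough for this):
    wget -q -O - http://ifconfig.me

    # If the two differ (e.g. the WAN address falls in 100.64.0.0/10 or a private range),
    # the provider is NAT-ing you and inbound connections will never reach the router.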

  • PainKiller13

    senior member

    in reply to mereteycs's post #3164

    Thanks, then maybe I just need to update my Tomato; mine is version 1.28.

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #7846

    Hi!

    Do you have Entware, with wget installed in it, or are you only using the version integrated into BusyBox?

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #7848

    I don't think that wget even knows these parameters.

    This is the integrated version on my router:

    root@INT-ROUTER:/tmp/home/root# wget --help
    BusyBox v1.23.2 (2015-10-08 23:00:41 CEST) multi-call binary.

    Usage: wget [-csq] [-O FILE] [-Y on/off] [-P DIR] [-U AGENT] [-T SEC] URL...

    Retrieve files via HTTP or FTP

    -s Spider mode - only check file existence
    -c Continue retrieval of aborted transfer
    -q Quiet
    -P DIR Save to DIR (default .)
    -T SEC Network read timeout is SEC seconds
    -O FILE Save to FILE ('-' for stdout)
    -U STR Use STR for User-Agent header
    -Y Use proxy ('on' or 'off')

    Its path, by the way:

    root@INT-ROUTER:/tmp/home/root# type wget
    wget is a tracked alias for /usr/bin/wget

    Compared to that, the Entware version:

    root@INT-ROUTER:/tmp/home/root# /opt/bin/wget --help
    GNU Wget 1.16.3, a non-interactive network retriever.
    Usage: wget [OPTION]... [URL]...

    Mandatory arguments to long options are mandatory for short options too.

    Startup:
    -V, --version display the version of Wget and exit
    -h, --help print this help
    -b, --background go to background after startup
    -e, --execute=COMMAND execute a `.wgetrc'-style command

    Logging and input file:
    -o, --output-file=FILE log messages to FILE
    -a, --append-output=FILE append messages to FILE
    -d, --debug print lots of debugging information
    -q, --quiet quiet (no output)
    -v, --verbose be verbose (this is the default)
    -nv, --no-verbose turn off verboseness, without being quiet
    --report-speed=TYPE output bandwidth as TYPE. TYPE can be bits
    -i, --input-file=FILE download URLs found in local or external FILE
    -F, --force-html treat input file as HTML
    -B, --base=URL resolves HTML input-file links (-i -F)
    relative to URL
    --config=FILE specify config file to use
    --no-config do not read any config file

    Download:
    -t, --tries=NUMBER set number of retries to NUMBER (0 unlimits)
    --retry-connrefused retry even if connection is refused
    -O, --output-document=FILE write documents to FILE
    -nc, --no-clobber skip downloads that would download to
    existing files (overwriting them)
    -c, --continue resume getting a partially-downloaded file
    --start-pos=OFFSET start downloading from zero-based position OFFSET
    --progress=TYPE select progress gauge type
    --show-progress display the progress bar in any verbosity mode
    -N, --timestamping don't re-retrieve files unless newer than
    local
    --no-use-server-timestamps don't set the local file's timestamp by
    the one on the server
    -S, --server-response print server response
    --spider don't download anything
    -T, --timeout=SECONDS set all timeout values to SECONDS
    --dns-timeout=SECS set the DNS lookup timeout to SECS
    --connect-timeout=SECS set the connect timeout to SECS
    --read-timeout=SECS set the read timeout to SECS
    -w, --wait=SECONDS wait SECONDS between retrievals
    --waitretry=SECONDS wait 1..SECONDS between retries of a retrieval
    --random-wait wait from 0.5*WAIT...1.5*WAIT secs between retrievals
    --no-proxy explicitly turn off proxy
    -Q, --quota=NUMBER set retrieval quota to NUMBER
    --bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host
    --limit-rate=RATE limit download rate to RATE
    --no-dns-cache disable caching DNS lookups
    --restrict-file-names=OS restrict chars in file names to ones OS allows
    --ignore-case ignore case when matching files/directories
    -4, --inet4-only connect only to IPv4 addresses
    -6, --inet6-only connect only to IPv6 addresses
    --prefer-family=FAMILY connect first to addresses of specified family,
    one of IPv6, IPv4, or none
    --user=USER set both ftp and http user to USER
    --password=PASS set both ftp and http password to PASS
    --ask-password prompt for passwords
    --no-iri turn off IRI support
    --local-encoding=ENC use ENC as the local encoding for IRIs
    --remote-encoding=ENC use ENC as the default remote encoding
    --unlink remove file before clobber

    Directories:
    -nd, --no-directories don't create directories
    -x, --force-directories force creation of directories
    -nH, --no-host-directories don't create host directories
    --protocol-directories use protocol name in directories
    -P, --directory-prefix=PREFIX save files to PREFIX/..
    --cut-dirs=NUMBER ignore NUMBER remote directory components

    HTTP options:
    --http-user=USER set http user to USER
    --http-password=PASS set http password to PASS
    --no-cache disallow server-cached data
    --default-page=NAME change the default page name (normally
    this is 'index.html'.)
    -E, --adjust-extension save HTML/CSS documents with proper extensions
    --ignore-length ignore 'Content-Length' header field
    --header=STRING insert STRING among the headers
    --max-redirect maximum redirections allowed per page
    --proxy-user=USER set USER as proxy username
    --proxy-password=PASS set PASS as proxy password
    --referer=URL include 'Referer: URL' header in HTTP request
    --save-headers save the HTTP headers to file
    -U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION
    --no-http-keep-alive disable HTTP keep-alive (persistent connections)
    --no-cookies don't use cookies
    --load-cookies=FILE load cookies from FILE before session
    --save-cookies=FILE save cookies to FILE after session
    --keep-session-cookies load and save session (non-permanent) cookies
    --post-data=STRING use the POST method; send STRING as the data
    --post-file=FILE use the POST method; send contents of FILE
    --method=HTTPMethod use method "HTTPMethod" in the request
    --body-data=STRING send STRING as data. --method MUST be set
    --body-file=FILE send contents of FILE. --method MUST be set
    --content-disposition honor the Content-Disposition header when
    choosing local file names (EXPERIMENTAL)
    --content-on-error output the received content on server errors
    --auth-no-challenge send Basic HTTP authentication information
    without first waiting for the server's
    challenge

    HTTPS (SSL/TLS) options:
    --secure-protocol=PR choose secure protocol, one of auto, SSLv2,
    SSLv3, TLSv1 and PFS
    --https-only only follow secure HTTPS links
    --no-check-certificate don't validate the server's certificate
    --certificate=FILE client certificate file
    --certificate-type=TYPE client certificate type, PEM or DER
    --private-key=FILE private key file
    --private-key-type=TYPE private key type, PEM or DER
    --ca-certificate=FILE file with the bundle of CAs
    --ca-directory=DIR directory where hash list of CAs is stored
    --crl-file=FILE file with bundle of CRLs
    --random-file=FILE file with random data for seeding the SSL PRNG
    --egd-file=FILE file naming the EGD socket with random data

    FTP options:
    --ftp-user=USER set ftp user to USER
    --ftp-password=PASS set ftp password to PASS
    --no-remove-listing don't remove '.listing' files
    --no-glob turn off FTP file name globbing
    --no-passive-ftp disable the "passive" transfer mode
    --preserve-permissions preserve remote file permissions
    --retr-symlinks when recursing, get linked-to files (not dir)

    WARC options:
    --warc-file=FILENAME save request/response data to a .warc.gz file
    --warc-header=STRING insert STRING into the warcinfo record
    --warc-max-size=NUMBER set maximum size of WARC files to NUMBER
    --warc-cdx write CDX index files
    --warc-dedup=FILENAME do not store records listed in this CDX file
    --no-warc-compression do not compress WARC files with GZIP
    --no-warc-digests do not calculate SHA1 digests
    --no-warc-keep-log do not store the log file in a WARC record
    --warc-tempdir=DIRECTORY location for temporary files created by the
    WARC writer

    Recursive download:
    -r, --recursive specify recursive download
    -l, --level=NUMBER maximum recursion depth (inf or 0 for infinite)
    --delete-after delete files locally after downloading them
    -k, --convert-links make links in downloaded HTML or CSS point to
    local files
    --backups=N before writing file X, rotate up to N backup files
    -K, --backup-converted before converting file X, back up as X.orig
    -m, --mirror shortcut for -N -r -l inf --no-remove-listing
    -p, --page-requisites get all images, etc. needed to display HTML page
    --strict-comments turn on strict (SGML) handling of HTML comments

    Recursive accept/reject:
    -A, --accept=LIST comma-separated list of accepted extensions
    -R, --reject=LIST comma-separated list of rejected extensions
    --accept-regex=REGEX regex matching accepted URLs
    --reject-regex=REGEX regex matching rejected URLs
    --regex-type=TYPE regex type (posix|pcre)
    -D, --domains=LIST comma-separated list of accepted domains
    --exclude-domains=LIST comma-separated list of rejected domains
    --follow-ftp follow FTP links from HTML documents
    --follow-tags=LIST comma-separated list of followed HTML tags
    --ignore-tags=LIST comma-separated list of ignored HTML tags
    -H, --span-hosts go to foreign hosts when recursive
    -L, --relative follow relative links only
    -I, --include-directories=LIST list of allowed directories
    --trust-server-names use the name specified by the redirection
    URL's last component
    -X, --exclude-directories=LIST list of excluded directories
    -np, --no-parent don't ascend to the parent directory

    Mail bug reports and suggestions to <bug-wget@gnu.org>
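
    So if a script needs the GNU-only options, one workaround is to call the Entware binary explicitly, or to put /opt/bin ahead of the BusyBox applet in PATH. A minimal sketch (the URL and file names are placeholders):

    # Call the Entware/GNU wget directly:
    /opt/bin/wget --no-check-certificate -O /tmp/file.bin "https://example.com/file.bin"

    # Or make /opt/bin take precedence for the rest of the script:
    export PATH=/opt/bin:/opt/sbin:$PATH
    type wget    # should now resolve to /opt/bin/wget (run 'hash -r' if the shell still remembers the old path)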

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #7973

    Hi!

    Why couldn't it be? It can be shared over Samba and FTP just like any other folder.

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #7975

    In the USB and NAS -> FTP Server menu, in the User Accounts section at the bottom, enter the /jffs folder in the Root Directory* field, and you're done.
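
    A quick way to verify it from a LAN machine (the router address and the credentials are placeholders):

    # List the FTP root; it should now show the contents of /jffs:
    curl --user admin:yourpassword ftp://192.168.1.1/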

  • Intruder2k5

    MODERATOR

    in reply to mereteycs's post #7977

    I don't know whether vsftpd follows symlinks, especially outside the root_dir, but if it does, that would work too.

    If not, then create a folder named jffs inside the folder that is currently shared over FTP, and bind-mount /jffs onto it (see the sketch below).

    An alternative solution is to share the root filesystem, although that isn't very secure.
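
    A minimal sketch of the bind-mount variant (SHARE is a placeholder for whatever folder is currently exported over FTP; BusyBox mount supports -o bind):

    # Folder currently shared over FTP (placeholder path):
    SHARE=/mnt/usbdisk/share

    mkdir -p "$SHARE/jffs"
    # Make /jffs appear inside the shared tree:
    mount -o bind /jffs "$SHARE/jffs"

    # To undo it later: umount "$SHARE/jffs"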

    [ Edited ]
