I use screen for my command-line tasks while managing the servers where I work. I usually run small commands (mostly file-system tasks), but sometimes I run more extensive tasks (like DBA work). The output of those tasks is important to me. Since I work from Ubuntu and OS X (in a terminal window on both) and I need to use screen, scrolling is not available, so any long output (think a 500-row table
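For the scrollback problem described here, screen keeps its own per-window buffer that can be paged through in copy mode; a minimal sketch, assuming GNU screen and a stock ~/.screenrc (the 10000-line figure is just an illustrative choice):

    # ~/.screenrc -- keep a larger scrollback buffer for every new window
    defscrollback 10000

    # Inside a running session (current window only):
    #   Ctrl-a :scrollback 10000
    # To page through captured output, enter copy/scrollback mode:
    #   Ctrl-a [   (or Ctrl-a Esc), then Up/Down or PgUp/PgDn; Esc to leave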
I have a PC with an Intel(R) Pentium(R) CPU G640 @ 2.80 GHz and 8 GB of RAM. I am running Scientific Linux 6.5 on it with an ext3 filesystem. On this setup, what is the fastest way to do a sort -u on a 200-gigabyte file? Should I split the file into smaller files (smaller than 8 GB), sort -u them, put them together, then split them again at a different size, sort -u again, etc.? Or are there any sort
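One common approach on a box like this is to let GNU sort do the external merge itself rather than splitting by hand: give it a large memory buffer and a temp directory with enough free space, and it writes sorted runs to disk and merges them. A rough sketch under those assumptions; the buffer size, paths, and chunk size below are placeholders:

    # Let sort spill temporary runs to disk and merge them itself.
    # LC_ALL=C forces byte-wise comparison, which is much faster than UTF-8.
    # (--parallel=2 could be added, but it needs coreutils >= 8.6;
    #  Scientific Linux 6.5 ships 8.4.)
    LC_ALL=C sort -u -S 6G -T /path/to/big/tmp -o output.txt input.txt

    # Manual split/sort/merge alternative (-m merges already-sorted inputs,
    # so there is no need to re-split and re-sort after the first pass):
    split -C 4G input.txt chunk.
    for f in chunk.*; do LC_ALL=C sort -u "$f" -o "$f.sorted"; done
    LC_ALL=C sort -m -u chunk.*.sorted -o output.txt
    rm chunk.*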
TL;DR or "Just scorch my pi" sudo apt-get remove --auto-remove --purge 'libx11-.*' sudo apt-get autoremove --purge (Repeat apt-get autoremove --purge until no orphans remain) Further explanation If a package foo depends on another package libfoo and you remove the libfoo package, the dependent (foo) is also removed. Because Foo has a depends line specifying libfoo, it would be broken to leave foo