Since I started using Hexo for my site a lot of things have been easier, like creating an API to access the site's contents. It was a nightmare for me in Drupal 7 and in the end I gave up long ago, but thanks to the work of John Wu and his plugin hexo-generator-api, generating a RESTful JSON API with read-only access to all the site's contents has been one of the simplest things I've worked on so far.
There are some things I would like to do that aren't available yet, but the code is released under the MIT license and I've got a fork on GitHub I can work on.
In the meantime I've written up some documentation about what is available and how to start making use of it.
Even if you never plan on using my endpoints to access my content (to be fair, I can't see why you would; I just enjoy having them), I hope the documentation can help anyone else who is using the hexo-generator-api plugin on their own site.
In some situations a quick trip out to buy a new drive is required, but in many cases space can easily be reclaimed by removing the gunk that's accumulated. But how do you determine what's junk? Linux has the du command, which will recursively search a directory and list all files and their sizes, but it still comes down to you to determine what should be kept and what should be removed.
In comes agedu (pronounced "age dee you"). Like du, this tool searches all directories and lists file sizes, but it can also differentiate between files that are still in use and ones that haven't been accessed in a long time.
From the man page:
agedu scans a directory tree and produces reports about how much disk space is used in each directory and sub-directory, and also how that usage of disk space corresponds to files with last-access times a long time ago.
In other words, agedu is a tool you might use to help you free up disk space. It lets you see which directories are taking up the most space, as du does; but unlike du, it also distinguishes between large collections of data which are still in use and ones which have not been accessed in months or years – for instance, large archives downloaded, unpacked, used once, and never cleaned up. Where du helps you find what’s using your disk space, agedu helps you find what’s wasting your disk space.
agedu has several operating modes. In one mode, it scans your disk and builds an index file containing a data structure which allows it to efficiently retrieve any information it might need. Typically, you would use it in this mode first, and then run it in one of a number of "query" modes to display a report of the disk space usage of a particular directory and its sub-directories. Those reports can be produced as plain text (much like du) or as HTML. agedu can even run as a miniature web server, presenting each directory's HTML report with hyperlinks to let you navigate around the file system to similar reports for other directories.
So, the install
Fedora 18, 19, 20 & 21
$ sudo yum install agedu
--> Running transaction check
---> Package agedu.x86_64 0:0-8.r9153.fc21 will be installed
The first step is to let agedu scan a directory; below I've just scanned my Downloads folder:
Built pathname index, 1032 entries, 96364 bytes of index
Faking directory atimes
Final index file size = 190304 bytes
To access the report you need to run agedu's built-in web server:
Using Linux /proc/net magic authentication
Now just fire up your browser and go to the URL stated:
There are other options available, such as the --exclude and --include arguments, which let you control which files are indexed. For example, if you wanted to see which ISOs were taking up the most space you'd use: agedu -s ./ --exclude '*' --include '*.iso'
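As I understand agedu's behaviour, the exclude/include patterns are applied in order and the last one that matches a file wins, which is why excluding everything and then including '*.iso' leaves only the ISOs. That logic can be sketched in Python (the helper name and rule format here are my own, not anything agedu exposes):

```python
from fnmatch import fnmatch

def is_indexed(filename, rules):
    """rules is an ordered list of ("include"|"exclude", pattern)
    pairs; the last pattern matching the filename wins.  Files
    matched by no rule are included by default."""
    keep = True
    for action, pattern in rules:
        if fnmatch(filename, pattern):
            keep = (action == "include")
    return keep
```

With rules = [("exclude", "*"), ("include", "*.iso")], a file like fedora.iso is kept while notes.txt is dropped, matching the example above.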
This post was designed to give you a quick overview of agedu, so I have only touched on the options available. Check out the man page or read through the developer's website for more details.
By default the processor in the Raspberry Pi runs at 700MHz, but it can be overclocked without voiding your warranty. Basically a processor is designed to do one job at a time, be it retrieving something from RAM or adding two numbers together; it's limited to one task. But when we're using one, the idea of one thing at a time is hard to get our heads around, since it appears to be doing so much more. That's because a processor can do that one task really, really, really fast. The clock speed, 700MHz, gives us an idea of how many tasks it can do per second; the higher the speed, the better the performance you get.
Overclocking simply means increasing the clock speed past its default. The problem is that if you overclock too much the processor becomes unstable, which can lead to crashes or even burn it out.
My Raspberry Pi is running Raspbian, so to overclock it simply type sudo raspi-config
Go down to item 7, Overclock, and press ENTER; press ENTER a second time to confirm the warning message.
raspi-config has five levels of overclocking: 700MHz (no overclocking), 800MHz (modest), 900MHz (medium), 950MHz (high) and 1000MHz (turbo). All of these are supported by the Raspberry Pi Foundation and will not void your warranty; overclocking to anything other than what's on this list, or overvolting the Raspberry Pi, will void the warranty.
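For reference, raspi-config applies these presets by editing /boot/config.txt. The medium preset amounts to something like the fragment below; the exact numbers here are from memory, so check your own /boot/config.txt after selecting a level rather than trusting them:

```
# /boot/config.txt -- medium (900MHz) overclock preset, values approximate
arm_freq=900
core_freq=250
sdram_freq=450
over_voltage=2
```

Knowing where the settings live also means you can undo an overclock by editing this file directly from another machine with the SD card mounted.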
Select the level of overclocking you want from the list, as below, and click on <Ok> to confirm your selection.
After that your Raspberry Pi will need to reboot for the new settings to take effect. Once it's back up you can check your settings by looking in /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq
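The value in that file is reported in kHz, so a quick way to see it in MHz is a one-line conversion (the helper name is my own):

```python
def max_freq_mhz(text):
    """Convert the kHz value read from scaling_max_freq to MHz."""
    return int(text.strip()) / 1000

# On the Pi itself you would read the kernel's file directly:
# with open("/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq") as f:
#     print(max_freq_mhz(f.read()))
```

A medium overclock, for example, should show up as 900000 in the file, i.e. 900MHz.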
If for any reason your Raspberry Pi fails to boot after you've overclocked it, hold down the shift key at boot time to temporarily disable overclocking, then just go back into sudo raspi-config and select a lower speed.
As you can see for yourself, I've rebuilt the site, again. I've code named this new iteration Alpe d'Huez, but it's really version 8.
So NxFIFTEEN Alpe d'Huez, what's the point? I've washed my hands of CMS installations. I started blogging almost from day one and used WordPress 1.0, then at some point switched to Drupal 5. The thing is, this site is so sparse on content that the overhead of dynamically building pages for every visit isn't required.
Over the years the demands on my hosting provider have been steadily creeping up in pursuit of better and better performance from the site. At its worst, just this past weekend, my Drupal home page took almost 80 seconds to load. That's after implementing Varnish and Boost. For a site I update infrequently at best, static HTML is perfect, so that's what I set out to produce.
Possibly the biggest reason for looking at a Node.js solution was that it is not written in PHP. I'm not against PHP like some; it's a language I've been using since 1998 and it's always served me well, but it's time I learnt something new, and Node.js is really taking off right now.
So that's my introduction to NxFIFTEEN. As you can imagine the transition took time and there were a lot of obstacles to surmount, but I'll save them for other posts.