Convert video to several JPG images on Linux without ffmpeg.

These days I just use this command and hit CTRL-C when the video position counter (V:) stops moving:

mplayer -vo jpeg:outdir=screenshots -sstep 10 filename.mp4

But this post remains for the sake of historical record – lol!


I admit… I do love PHP in the command line. Does that make me a bad person? 😉

Here’s a tiny little script that I wrote to create many JPG screenshots of a video file. I use this each week to create a bunch of stills from our broadcast so I can use them as thumbnails and so on. I didn’t want it to depend on ffmpeg since I don’t have that on any of my modern systems.

It requires just three packages: mplayer, mediainfo, and PHP 5 (the command-line interpreter).

Save it as whatever.php and run it like this: php whatever.php file.wmv

It will create a folder called file-Screenshots/ and will save one picture for every 10 seconds of video, from any video source. Just change “file.wmv” to the name of your video. Include the path if it’s not in the current folder.
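For example, with file.wmv as the source you end up with something like this (mplayer numbers the frames itself, so the exact filenames on your system may differ):

php whatever.php file.wmv
ls file-Screenshots/
00000001.jpg  00000002.jpg  00000003.jpg  ...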

<?php
  // Depends: mplayer mediainfo
  // Does not need ffmpeg (deprecated)

  if ($_GET) {
    $file = $_GET['file'];
  } else {
    $file = $argv[1];
  }
  
  if (strlen($file) < 3) exit('Need a proper filename for input.' . PHP_EOL);  
  // Name the screenshots folder after the input file (everything before the first dot).
  $parts = explode('.', $file);
  $dir = array_shift($parts) . '-Screenshots';

  $duration = duration($file);
  echo 'Duration in Seconds: ' . $duration . PHP_EOL;
  echo 'Saving to folder:    ' . $dir . PHP_EOL;
  echo 'Creating ' . ceil($duration/10) . ' JPG images from source...';
  exec('mplayer -vo jpeg:outdir=' . escapeshellarg($dir) . ' -sstep 10 -endpos ' . ($duration-2) . ' ' . escapeshellarg($file) . ' > /dev/null 2>&1');
  echo ' Done.' . PHP_EOL; 

  function duration($file) {
    if (file_exists($file)) {
      // Ask mediainfo for the file's details and pull the Duration line; this
      // parsing assumes the duration is reported in the "Xh Ymn" form.
      exec('mediainfo ' . escapeshellarg($file) . ' | grep -m1 Duration | cut -d\':\' -f2', $result);
      $tmp = explode('h', $result[0]);
      $seconds = ((intval($tmp[0]) * 60) + intval($tmp[1])) * 60;
      return $seconds;
    } else {
      exit('File ' . $file . ' not found.' . PHP_EOL);
    }
  }
?>

Hope it helps you out.

-Robbie

Find the version number of all WordPress installations on your Linux server.

I have a lot of customers running WordPress on our shared hosting servers, and sometimes they neglect to update their WordPress installs. [Rolls Eyes]

I need to know which of these sites are using an obsolete version of WordPress so I may contact the customer and warn them that they need to update their software.

So here’s a helpful little Linux command I whipped up and ran as root to go through my /home folder searching for all WordPress versions. I only had to run it as root because I am checking through all users’ folders, not just my own. If you only want to check your own user, you don’t need root access.

I ran this command from my /home folder on the Linux server:

find . -name 'version.php' -exec grep '$wp_version =' {} /dev/null \; > /tmp/wordpress-versions.log

Breakdown:

  • find . -name 'version.php'
    Search through the current folder, recursively, for any file named version.php. This is where WordPress stores the WordPress version number.
  • -exec
    Execute a command with each found item.
  • grep '$wp_version =' {}
    Look within each found version.php file for the term $wp_version = and output the result.
  • /dev/null
    Trick grep into thinking there is a second file, forcing it to precede the output with the filename provided by find.
  • \;
    Terminate the -exec command.
  • > /tmp/wordpress-versions.log
    Save the results to a log file in /tmp. You can tail -f this file while scanning, or simply open or cat it when you’re done. Leave this portion out of the command if you’d rather have it output directly to your screen.
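Each line of the resulting log shows the path to the matching version.php followed by the version string it contains. The usernames and versions below are purely an illustration of the format, not real results:

./bob/public_html/wp-includes/version.php:$wp_version = '3.8.1';
./alice/domains/example.com/public_html/wp-includes/version.php:$wp_version = '3.5.2';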

Memory leak in Zimbra 8.0.6 webmail

I’ve had a suspicion that since the Zimbra 8.0.6 update, something’s been wonky with Zimbra’s webmail client, so I decided to perform a very simple test: open Zimbra Webmail and leave it running.

Here is the outcome of that test.

Normal Operation for one business day.
This is how I operate day after day in my normal office environment.

Running:

  • Zimbra 8.0.6_GA_5922
  • Chromium Version 32.0.1700.123 Debian 7.4 (248368)

Memory Usage at Application Launch:

  • Browser window with Zimbra webmail client
    Thursday 8:47am – 139.5MB
  • Browser window with Google
    Thursday 8:55am – 45.92MB

Memory Usage ~ 24 Hours Later:
I left both browser windows running overnight. Here is where the memory usage stands…

  • Browser window with Zimbra webmail client
    Friday 8:28am – 564.3MB – 304% Increase in 24 Hours
  • Browser window with Google
    Friday 8:29am – 60.78MB – 32% Increase in 24 Hours

Memory Usage ~4 Days Later:
I left both browser windows running over the weekend.  Here is where the memory usage stands…

  • Browser window with Zimbra webmail client
    Monday 10:32am – 1.6 GB – 1,046% Increase in ~96 Hours
  • Browser window with Google
    Monday 10:33am – 60.53MB – 31% Increase in ~96 Hours

Since Zimbra cut the Evolution Connector from its product line, and the Zimbra Desktop software is still only available for a 32-bit platform, this leaves Zimbra operation on Linux sorely lacking. What has VMware done?! Hopefully Telligent can fix it.

-Robbie

*** UPDATE March 24, 2014 ***
We’ve ruled out Chrome by itself as the issue since it is only the window containing the Zimbra webmail client that shows any increase in memory usage.

To rule out browser extensions, I will run my next test with all Chrome extensions disabled.
Test 2 – Disable all Chrome Extensions and re-test.

Memory Usage Before Weekend:

  • Browser window with Zimbra webmail client
    Friday 5:06pm – 106.3 MB

Memory Usage After Weekend:

  • Browser window with Zimbra webmail client
    Monday 8:37am – 1.7 GB

*** UPDATE April 15, 2014 ***
This has been filed as Zimbra bug report 88031: https://bugzilla.zimbra.com/show_bug.cgi?id=88031

Preventing rsync from doubling, or even tripling, your S3 fees.

Using rsync to upload files to Amazon S3 over s3fs? You might be paying double, or even triple, the S3 fees.

I was observing the file upload progress on the transcoder server this morning, curious how it was moving along, and I noticed something: the currently uploading file had an odd name.

My file, CAT5TV-265-Writing-Without-Distractions-With-Free-Software-HD.m4v was being uploaded as .CAT5TV-265-Writing-Without-Distractions-With-Free-Software-HD.m4v.f100q3.

I use rsync to upload the files to the S3 folder over S3FS on Debian, because it offers good bandwidth control.  I can restrict how much of our upstream bandwidth is dedicated to the upload and prevent it from slowing down our other services.

Noticing the filename this morning, and understanding the way rsync works, I knew the temporary filename would be renamed the instant the upload was complete.

In a normal disk-to-disk operation, or when rsync’ing over something such as SSH, that’s fine, because a rename (mv oldname newname) uses virtually no resources and certainly doesn’t cost anything: it’s a simple rename operation. So why did my antennae go up this morning? Because I also know how S3FS works.

A rename operation over S3FS means the file is first downloaded to a file in /tmp, renamed, and then re-uploaded.  So what rsync is effectively doing is:

  1. Uploading the file to S3 with a random filename, with bandwidth restrictions.
  2. Downloading the file to /tmp with no bandwidth restrictions.
  3. Renaming the /tmp file.
  4. Re-uploading the file to S3 with no bandwidth restrictions.
  5. Deleting the temp files.

Fortunately, this is 2013 and not 2002.  The developers of rsync realized at some point that direct uploading may be desired in some cases.  I don’t think they had S3FS in mind, but it certainly fits the bill.

The option is --inplace.

Here is what the manpage says about --inplace:

This option changes how rsync transfers a file when its data needs to be updated: instead of the default method of creating a new copy of the file and moving it into place when it is complete, rsync instead writes the updated data directly to the destination file.

It’s that simple!  Adding --inplace to your rsync command will cut your Amazon S3 transfer fees by as much as 2/3 for future rsync transactions!
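For example (the mount point and bandwidth limit here are just placeholders for your own setup), the upload command becomes something like:

rsync -av --inplace --bwlimit=1000 CAT5TV-265-Writing-Without-Distractions-With-Free-Software-HD.m4v /mnt/s3/videos/

With --inplace, rsync writes straight into the destination file on the S3FS mount, so the data crosses your upstream link exactly once instead of being uploaded, pulled back down to /tmp, and uploaded again.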

I’m glad I caught this before the transcoders transferred all 314 episodes of Category5 Technology TV to S3.  I just saved us a boatload of cash.

Happy coding!

– Robbie

Running phpcs against many domains to test PHP5 Compatibility.

Running a shared hosting service (or otherwise having a ton of web sites hosted on the same server) can pose challenges when it comes to upgrading.  What’s going to happen if you upgrade something to do with the web server, and it breaks a bunch of sites?

That’s what I ran into this week.

For security reasons, we needed to knock PHP4 off our Apache server and force all users onto PHP5.

But a quick test showed us that this broke a number of older sites (especially sites running on old code for things like OS Commerce or Joomla).

I can’t possibly scan through billions of lines of client code to see if their site will work or break, nor can I click every link and test everything after upgrading them to PHP5.

So automation takes over, and we look at PHP_CodeSniffer with the PHPCompatibility standard installed.

Making it work was a bit of a pain in the first place, and you’ll need some know-how to get it going.  There are inconsistencies in the documentation and even some incorrect instructions on getting it running.  However, a good place to start is http://techblog.wimgodden.be…..

Running the command on a specific folder (e.g. phpcs --extensions=php --standard=PHP53Compat /home/myuser/domains/mydomain.com/public_html) works great.  But as soon as you decide to try to run it through many, many domains, it craps out.  It literally just hangs, and usually not until it’s been running for a few hours, so what a waste of time.

So I wrote a quick script to help with this issue.  It (in its existing form – feel free to mash it up to suit your needs) first generates a list of all public_html and private_html folders recursive to your /home folder.  It then runs phpcs against everything it finds, but does it one site at a time (so no hanging).

I suggest you run phpcs against one domain first to ensure that you have phpcs and the PHPCompatibility standard installed and configured correctly.  Once you’ve successfully tested it, then use this script to automate the scanning process.

You can run the script from anywhere, but it must have a tmp and results folder within the current folder.

Eg.:
mkdir /scanphp
cd /scanphp
mkdir tmp
mkdir results

And then place the PHP file in /scanphp and run it like this:
php myfile.php (or whatever you ended up calling it)

Remember, this script is to be run through a terminal session, not in a browser.

<?php
  // Build a list of every public_html and private_html folder under /home.
  exec('find /home -type d -iname \'public_html\' > tmp/public_html');
  exec('find /home -type d -iname \'private_html\' > tmp/private_html');
  // FILE_IGNORE_NEW_LINES keeps the trailing newline off each path so the
  // exec() calls below receive a clean folder name.
  $public_html = file('tmp/public_html', FILE_IGNORE_NEW_LINES);
  $private_html = file('tmp/private_html', FILE_IGNORE_NEW_LINES);

  foreach ($public_html as $folder) {
    // Paths look like /home/<user>/domains/<domain>/public_html (as in the example
    // above), so the domain is the second-to-last element and the user follows /home.
    $tmp = explode('/', $folder);
    $domain = $tmp[(count($tmp)-2)];
    if ($domain == '.htpasswd' || $domain == 'public_html') $domain = $tmp[(count($tmp)-3)];
    $user = $tmp[2];
    echo 'Running scan: ' . $folder . ' (' . $user . '->' . $domain . ')... ';
    exec('echo "Scan Results for ' . $folder . '" >> results/' . $user . '_' . $domain . '.log');
    exec('phpcs --extensions=php --standard=PHP53Compat ' . $folder . ' >> results/' . $user . '_' . $domain . '.log');
    exec('echo "" >> results/' . $user . '_' . $domain . '.log');
    exec('echo "" >> results/' . $user . '_' . $domain . '.log');
    echo 'Done.' . PHP_EOL . PHP_EOL;
  }

  foreach ($private_html as $folder) {
    // Same logic as above, for private_html folders.
    $tmp = explode('/', $folder);
    $domain = $tmp[(count($tmp)-2)];
    if ($domain == '.htpasswd' || $domain == 'private_html') $domain = $tmp[(count($tmp)-3)];
    $user = $tmp[2];
    echo 'Running scan: ' . $folder . ' (' . $user . '->' . $domain . ')... ';
    exec('echo "Scan Results for ' . $folder . '" >> results/' . $user . '_' . $domain . '.log');
    exec('phpcs --extensions=php --standard=PHP53Compat ' . $folder . ' >> results/' . $user . '_' . $domain . '.log');
    exec('echo "" >> results/' . $user . '_' . $domain . '.log');
    exec('echo "" >> results/' . $user . '_' . $domain . '.log');
    echo 'Done.' . PHP_EOL . PHP_EOL;
  }

?>

See what we’re doing there?  Easy breezy, and it solves the problem of having to run phpcs against a massive number of domains.
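When the scan finishes, the results folder holds one log per site, named after the user and the domain (the names below are only an illustration of the pattern):

ls results/
bob_example.com.log
alice_shop.example.org.log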

Let me know if it helped!

– Robbie

Walk-in Wifi Responder

Had a thought this morning that wifi could be used to do some pretty rad stuff… like detecting when I get home by noticing my iPod touch.

Since most of us carry wifi-enabled devices with us at all times, and most of us have them set to auto-connect once in range of our routers, I thought, why not use that data?  It could be as simple as logging coming and going, or as sophisticated as automatically turning on my favorite music when I walk in the door.  Or even adjusting the thermostat when I arrive home to save energy when nobody is around.

As a very brief proof of concept I whipped up a simple script in PHP which can be run from any Linux computer on your network.

Usage:  php wifi-check.php --device=devicename

<?php
// Run from the Linux terminal: php wifi-check.php --device=devicename
// (Header reconstructed to match the surviving code: CLI check, --device
//  parsing and the ping call.)
if (php_sapi_name() == 'cli') {
  // Pull the device name out of --device=devicename
  $device = '';
  foreach ($argv as $arg) {
    if (substr($arg, 0, 9) == '--device=') $device = substr($arg, 9);
  }
  if (strlen($device) > 0) {
    // Ping the device once and look for the rtt summary line to see whether it answered.
    $result = array();
    exec('ping -c 1 ' . escapeshellarg($device), $pings);
    foreach ($pings as $ping) {
      if (strpos($ping, 'min/avg/max') > 0) {
        $tmp=@explode('=', $ping);
        $result=@explode('/', trim($tmp[1]));
      }
    }
    if (count($result) > 0) {
      // Now we know the device is connected; do something
      echo 'Device active.' . PHP_EOL;
    } else {
      // Device is not connected.
      echo 'Device inactive.' . PHP_EOL;
    }
  } else {
    echo 'Usage: php wifi-check.php --device=devicename' . PHP_EOL;
  }
} else {
  echo 'This script is designed to be run from the Linux terminal, not a browser.';
}
?>

My thinking is to put something like this in a looping script that runs every so many seconds, calling particular functions when the device changes between active and inactive.
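Here’s a minimal sketch of that idea. The device name, the 30-second interval, and the “arrived”/“left” actions are all placeholders, and the wifi_device_active() helper simply wraps the same ping check used in the script above:

<?php
// Hypothetical polling loop (sketch): check the device periodically and react
// when it changes between present and absent.
function wifi_device_active($device) {
  // ping exits 0 when the host replied, non-zero otherwise.
  exec('ping -c 1 ' . escapeshellarg($device), $output, $rc);
  return ($rc === 0);
}

$device = 'devicename';  // placeholder: your device's hostname or IP
$was_active = false;

while (true) {
  $active = wifi_device_active($device);
  if ($active && !$was_active) {
    echo 'Device arrived: ' . date('H:i:s') . PHP_EOL;  // e.g. start the music
  } elseif (!$active && $was_active) {
    echo 'Device left: ' . date('H:i:s') . PHP_EOL;     // e.g. turn down the thermostat
  }
  $was_active = $active;
  sleep(30);  // check again in 30 seconds
}
?>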

I’d welcome your thoughts in the comment section below.  What practical things could this be used for?