Sep 18 2017

I love dashcam footage from vacation. Not only do I get photos of the various places I’ve visited, but also the routes I used to get to them. And there are some beautiful drives out there.

My Garmin DriveAssist 50 outputs full HD baseline MP4 videos with a constant bitrate slightly higher than 8 Mbps. As with all dashcam videos, the quality is nothing to get excited about, but it isn’t too bad either. My standard process is simply to re-cut these videos to throw out idle time (e.g. while parked). Since I don’t really use them for further editing, my compression settings are a bit on the aggressive side.

I start by customizing the Internet HD 1080p profile as it has most of the configuration the way I want it. What I do change is the Profile level, simply increasing it to High. I consume these videos on my PC and there is simply no reason I could think of to go lower. And yes, the High profile works on mobile phones too.

Dashcam videos lend themselves quite well to MP4 compression, so I went with an average variable rate of 6 Mbps. For moments when a bit more action is happening, a maximum of 8 Mbps should suffice. Note that I use two-pass encoding here in order to squeeze out a bit more quality despite the lower bitrate. This does double the encoding time, but I find it a reasonable compromise.

I also uncheck progressive download support as I don’t intend to stream these, and this gives the encoder a few more bits to work with.

Lastly, as my camera doesn’t record sound, I turn the audio off.
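For those who prefer the command line, roughly equivalent settings can be sketched with ffmpeg. This is an illustrative sketch rather than my actual workflow; the file names and the 16 MB rate-control buffer are my assumptions, not something from the profile above:

```shell
# Two-pass H.264 encode: High profile, 6 Mbps average, 8 Mbps cap, no audio.
# File names and the 16M buffer size are placeholders.
ffmpeg -y -i dashcam.mp4 -c:v libx264 -profile:v high \
       -b:v 6M -maxrate 8M -bufsize 16M -an -pass 1 -f mp4 /dev/null
ffmpeg -i dashcam.mp4 -c:v libx264 -profile:v high \
       -b:v 6M -maxrate 8M -bufsize 16M -an -pass 2 dashcam-recut.mp4
```

The first pass only gathers statistics (hence the output to /dev/null); the second pass does the actual encode using them.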

Sep 16 2017

The moment I’ve been waiting for since Visual Studio 2017 was released is finally (almost) here.

While the Community edition is more capable, restrictions on its use within anything but the smallest (or open source) companies are quite limiting. While developers can (hopefully) get at least the Professional or even the Enterprise edition, the same cannot be said for people who might need access to code just occasionally or who enjoy making small internal tools. For them, Express used to be a godsend due to its quite permissive license.

The preview version does have license restrictions on production use, but my expectation is that the final version will have the same license as the 2015 one.

Sysadmins, rejoice!

Sep 13 2017

After reorganizing my ZFS datasets a bit, I suddenly noticed I couldn’t copy any file larger than a few MB. A bit of investigation later, I figured out why.

My ZFS datasets were as follows:

# zfs list
NAME                            USED  AVAIL  REFER  MOUNTPOINT
Data                           2.06T   965G    96K  none
Data/Users                      181G   965G    96K  none
Data/Users/User1               44.3G  19.7G  2.23G  /Data/Users/User1
Data/Users/User2               14.7G  49.3G   264K  /Data/Users/User2
Data/Users/User3                224K  64.0G    96K  /Data/Users/User3
And my Samba share was pointing to /Data/Users/.

Guess what? The path /Data/Users was not pointing to any dataset as my parent dataset Data/Users was not mounted. Instead, it pointed to memory disk md0, which had just a few MB free. Samba doesn’t check the full path for disk size, only the share’s root.
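A quick way to see which filesystem actually backs a path is df. Output here will depend on the system; on a setup like mine it would show md0 rather than a ZFS dataset:

```shell
# Show the filesystem backing the share's root path
# (the path is from my setup; device names will differ per system)
df -h /Data/Users
```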

The easiest workaround would be to simply mount the parent dataset. But why go for easy?

A slightly more complicated solution is to have Samba use a custom script to determine free space. We can then use this script to return the available disk space for our parent dataset instead of relying on Samba’s built-in calculation.

To do this, we first create the script /myScripts/sambaDiskFree:

#!/bin/sh
# map the current directory to a dataset name by stripping the leading slash
DATASET=$(pwd | cut -c2-)
zfs list -H -p -o available,used "$DATASET" | awk '{print $1+$2 " " $1}'

This script determines the current directory, maps its name to a dataset (in my case as easy as stripping the first slash character), and returns two numbers: first the total disk space, followed by the available disk space – both in bytes.
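The arithmetic in the awk step can be checked in isolation by feeding it a sample “available used” pair as zfs list -H -p would print it (the numbers below are made up, not real dataset sizes):

```shell
# zfs list -H -p -o available,used prints available and used in bytes;
# awk turns that into "total available" (sample numbers only)
echo "6000 4000" | awk '{print $1+$2 " " $1}'
# prints: 10000 6000
```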

Once the script is saved and marked as executable (chmod +x), we just need to reference it in Services > CIFS/SMB > Settings under Additional parameters:

dfree command = /myScripts/sambaDiskFree

This tells Samba to use our script for disk space determination.

Sep 09 2017

If all went as expected, my fourth visit to Seattle Code Camp is currently in progress and my second talk is winding down just about now. If you decided to see my talk among the more than 70 talks in 11 parallel tracks – thank you!

If not, here is what you missed:

My first talk was about my experience with Microsoft’s Project Centennial, a way to bring classic desktop applications to the Windows Store. It was based on my experience getting Bimil into the Windows Store.

The second talk is a bit of a copout as it’s a rerun of my talk from last year. And no, it’s not completely the same. I added a bit more ranting. :)

Slides are available for download, but they’re no substitute for attending the conference.