At home, I use Qubes. On a regular OS, any program you are running can
log your keystrokes and take screenshots - yikes! How am I supposed to
trust every single piece of software running?! Qubes takes the
virtualization technologies that protect us on servers and applies
them to the desktop user. Each activity you do (e.g. general web
browsing, work, banking) is compartmentalized into its own VM. If you
get a virus in one, it can't jump into the others. If you download
a PDF from an untrustworthy source (or grab some sketchy
software), you can open it in a disposable VM, read it, and when you
are done the whole thing seamlessly disappears. If anyone wants to
know more, please hit me up.
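To make that concrete, here's a minimal sketch of the disposable-VM workflow using the standard Qubes command-line tools (the file name is a hypothetical example; the first command runs from inside an AppVM, the second from dom0):

```shell
# Open an untrusted PDF in a fresh disposable VM; the DispVM, and
# anything the file does inside it, is destroyed when the viewer closes.
qvm-open-in-dvm ~/Downloads/sketchy.pdf

# Or, from dom0, get an interactive shell in a disposable VM
# for poking at untrusted software:
qvm-run --dispvm xterm
```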

At work, I develop on both Mac (with MacPorts) and Linux. Most of my
work takes place on CentOS servers running on VMware vSphere/ESX. I
work on server files using local IDEs/text editors via sshfs.
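That sshfs setup looks roughly like this (host name and paths are hypothetical examples):

```shell
# Mount a remote project directory from a CentOS server locally,
# so local IDEs/text editors see the files as ordinary local files.
mkdir -p ~/mnt/devserver
sshfs user@devserver.example.org:/srv/project ~/mnt/devserver

# ... edit with your local tools ...

# Unmount when done (on macOS, use `umount ~/mnt/devserver` instead).
fusermount -u ~/mnt/devserver
```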

I'm still looking for the best Linux distro. I was a big Arch fan, but
their package manager punishes you for falling behind (if you don't
constantly update, your system will eventually break). Rolling release
is great, but the way they do it is a deal breaker. To me, open,
transparent governance is important, which makes me quite interested in
Debian. Gentoo is also of interest. SUSE seems promising too, but I
haven't tried it since the '90s (does anyone in the US actually use
it?). I currently mostly use Fedora or CentOS in VMs (on servers or in
Qubes).

I was poking around with SaltStack but don't use it in production.
Lately I've been excited about Red Hat's OpenShift. It's a PaaS (you
provide the software to run, they provide the rest). For free, you can
try out containerized development and continuous integration without
having to go through the enormous burden of installation and
configuration.
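For anyone curious, the basic workflow with the `oc` client looks roughly like this (the project name, repo URL, and token are hypothetical placeholders):

```shell
# Log in to the cluster, then build and deploy straight from a git repo;
# OpenShift detects the language and builds a container image for you.
oc login https://api.example.openshift.com --token=<your-token>
oc new-project demo
oc new-app https://github.com/example/example-app.git
oc logs -f bc/example-app    # follow the build
oc expose svc/example-app    # create a public route to the app
```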

I'm into the idea of having a recipe that you can actually run to
set up a software environment. Still, to me, a lot of the tools and
workflows are immature (e.g. downloading images off of randos on
the internet, running everything as root, bugs, etc.). Plus I see a lot
of shops spending so much time setting up their infrastructure and
keeping it running that it's not clear they are actually gaining
anything. We aren't running at Google scale; it doesn't always make
sense to model our infrastructure on theirs.
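As a toy example of what I mean by a runnable recipe, here's a minimal idempotent setup script (the directory layout is made up; a real recipe would be a Salt state, an Ansible playbook, a Dockerfile, etc.):

```shell
#!/bin/sh
# Toy "recipe": create a dev environment skeleton. Safe to re-run;
# running it twice leaves the environment unchanged.
set -eu
ENV_DIR="${ENV_DIR:-$HOME/devenv}"

mkdir -p "$ENV_DIR/bin" "$ENV_DIR/etc"

# Only provision once; later runs are no-ops.
if [ ! -f "$ENV_DIR/etc/provisioned" ]; then
    printf 'provisioned\n' > "$ENV_DIR/etc/provisioned"
fi

echo "environment ready at $ENV_DIR"
```

The point isn't the script itself; it's that the whole environment is reproducible from one command instead of a pile of undocumented manual steps.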

Tom #rant4lib Hutchinson

On Tue, Jan 16, 2018 at 12:00 PM, Jonathan Rochkind <[log in to unmask]> wrote:
> I use a mac for developing, and don't use it as a sort of terminal tool, I
> develop _on_ the mac.  I install whatever I need there. MacOS is a kind of
> unix, and `brew` usually gives me whatever I need.  But I don't do things
> that my local macbook doesn't have the CPU power for.
> Then I deploy to a staging server, and ultimately to production, using
> automated deploy tools. I never edit things directly on the production or
> staging server.
> The staging/production servers are always some kind of linux, I don't
> really care if they are VMs or whatever. On my current project, they are
> all AWS resources.
> Jonathan
> On Mon, Jan 15, 2018 at 10:52 AM, Eric Lease Morgan <[log in to unmask]> wrote:
>> I’m just curious. What sorts of computing environments do y’all
>> use/exploit?
>> For a long long time I used my Macintosh as a sort of terminal tool
>> connected to a Unix/Linux computer where I did my “real” computing.
>> Now-a-days, I still use this set up, but the Unix/Linux environment is
>> increasingly a virtual machine, a “large” multi-core computer, or a
>> “cluster”. More specifically, I have learned to increase my throughput with
>> parallel processing and/or cluster computing. This is advantageous because
>> I do dozens of processes against 10’s of thousands plain text files, and
>> doing such work on against a single CPU is not feasible. Map/reduce is a
>> good thing! Because of parallel processing a cluster computing, I have had
>> to change some of my programming paradigms. The whole thing is very
>> interesting.
>> What sorts of environments do y’all use?
>> —
>> Eric Lease Morgan
>> University of Notre Dame