Downside of DIY

Posted by on Mar 27, 2017 in Hardware

In the long war between the PC Master Race and the Console Peasants (their words, not mine), PC aficionados point to their platform's modularity as one of its overarching strengths. Each console generation, they say, limits users to whatever is in the box. When technology changes, the only recourse is to buy a whole new console at full price — often while the current console generation is a long way from being obsolete. PC users, on the other hand, can upgrade piecemeal over time; while the initial outlay far outstrips the cost of a single console, it's easier to upgrade over time where it counts: a new video card here, more storage capacity there.

Unlike assembling furniture from a kit, buying parts to build a custom PC has its own dangers that consoles will never (should never) see. Individual parts have to be ordered and arrive in sealed packages. In all honesty, the act of building the PC — putting the parts together — is simple and well documented, both in writing and on the Internet (you do have another PC or tablet you can use to watch YouTube videos in the event of an emergency, right?). What usually throws things off-kilter is the quality of the parts.

Buying an "off the shelf" PC from iBuyPower or Falcon or Alienware is generally looked upon as a cardinal sin by the PC Master Race, which views the act of putting parts together the way Jedi view building a custom lightsaber: it's a rite of passage, a display of mastery, and an expression of individuality. What they don't care to take into account, though, is that pre-built systems usually go through an intense "burn-in" period, where the assembler runs intensive operations on the system for an extended stretch of time to ensure that everything is working as intended. When building a custom PC at home from parts that arrive individually shrink-wrapped, it's up to the end-user to perform that burn-in, and should anything arise from the process, it's up to the end-user to deal with it.

This past week I ran into the dreaded (and increasingly rare) BSOD — the Blue Screen of Death. It always seemed to happen after playing Mass Effect: Andromeda, which is probably the most taxing application I currently have. Often the BSOD would pop up after about an hour of gameplay, sometimes when just the game was running, but sometimes when I was trying to work with the streaming platform Lightstream (something that occupied a lot of my time this weekend).

BSODs are cryptic and, despite the notion that they might be helpful, aren't very user-parsable. I downloaded some memory dump readers from Microsoft that can help decipher these errors, and I think I narrowed it down to an issue with the RAM — always one of the most obvious culprits when things go belly-up. I downloaded and ran Memtest86+, which boots from its own standalone media outside of Windows, puts the RAM through several tests, and reports on error conditions. When I returned, I found this:
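As an aside: before feeding anything to a dump reader, it helps to know which crash dump you're even looking at. A tiny sketch like this (assuming Windows is configured to write minidumps to the default C:\Windows\Minidump folder, which it usually is) lists them newest-first so you can match a dump to the crash you just lived through:

```python
from pathlib import Path
from datetime import datetime

# Windows drops a .dmp file in C:\Windows\Minidump for each BSOD (when
# minidumps are enabled). Listing them newest-first shows which crash
# you're about to dig into with a dump reader.
dump_dir = Path(r"C:\Windows\Minidump")
for dump in sorted(dump_dir.glob("*.dmp"),
                   key=lambda p: p.stat().st_mtime, reverse=True):
    stamp = datetime.fromtimestamp(dump.stat().st_mtime)
    print(f"{stamp:%Y-%m-%d %H:%M}  {dump.name}  ({dump.stat().st_size // 1024} KB)")
```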

The particulars aren't important; know that "red is bad…very, very bad" and there's a lot of red there. Having already suspected that the RAM was at fault, I had overnighted the same RAM from Amazon and had it sitting on the desk for its inevitable deployment. After the memtest results I took the system down and swapped the RAM, booting straight back into memtest and running it on the new memory, which passed without any issues. I have yet to actually put the system through the wringer again (i.e. Mass Effect), but the fact that the original RAM showed errors and the new stuff did not gives me a high probability of success.

The problem is that while the RAM was obviously bad…what if the BSODs were a result of something else, and the RAM just became the scapegoat? I monitored CPU temperatures while playing The Elder Scrolls Online last night and they stayed around 40-60 degrees Celsius (supposedly a good range for an i7-7700K). Voltages for the CPU and the RAM are also within normal operating limits, as I've not gone anywhere near the overclocking abilities of the motherboard. I would hate for the problem to be my video card, although while I was in the case working with the RAM I realized that I had never secured the card to the rear plate with screws, so I fixed that oversight immediately to ensure that the card didn't rock itself out of its PCIe slot.

Last time I upgraded my PC I bought an off-the-shelf system because this kind of situation is exactly what I didn't want to have to deal with. Of course, that was almost 10 years ago, and we now live in the age of Amazon Prime and no-hassle returns. After I posted this screenshot on Twitter, Belghast mentioned that he was sad the age of local PC parts stores has faded into memory (no pun intended). It would have been a lot easier if I could have jumped in the car and driven down to a shop to swap parts, or at least had them test the RAM before I bought it. Amazon may be convenient, but it's still a step back from where we were when CompUSA and Computer City were a thing in my area.

My PC will now destroy any console currently on the market (the Scorpio remains to be seen), but at what cost? I do prefer the PC for gaming for many other reasons, but the potential for hardware issues straight out of the box is a major strike against it, at least until the parts have gone through their proper burn-in phase.


New PC: The Aftermath

Posted by on Mar 20, 2017 in Gaming, Glamour Shots, Hardware, Software


The “before” picture

I finally got around to building my new PC on Saturday morning. I woke up early (about 6:45 AM), partly because I'm getting old and old people eat dinner early and also wake up early, and partly because it was like gawddamned Christmas. I could tell that it had been quite some time since I'd last built a custom PC, because it took me hours, for several reasons. First, the motherboard is a gaming motherboard, which means there are a lot more gee-gaws on there in the event that the builder wants to overclock, side clock, cock-block, and rock around the clock. Second, the case I had bought was nicer than a sub-$100 case had any right to be, complete with a removable side panel that allowed me to route 98% of the cables out of sight to create the cleanest-looking internal configuration I've ever had the pleasure to build. Third, I didn't want to put an optical drive in there because everything is on the Internet now, except that for some reason this board's integrated network port didn't work at all without drivers…which were on the DVD that came with the motherboard. I had to open the side panel, disconnect one of the HDDs, and wire in a spare DVD drive to get at the install media that would let me get the PC online.

NSFW

To say that I'm pleased with this PC is an understatement. It used to be that my PC was the loudest thing in the room, what with its aging cooling system ramping up the fans just because the day ended with the letter "y". Now I can hear all kinds of other things from the corners of the basement (and I'm kind of worried about the state of my house as a result) because the fans are software-controlled and the water cooling system is whisper-quiet. Honestly, I still expect to hear the fans whine at certain points of operation, and it takes me a second or two to realize why I'm not hearing them.

My biggest problem? Finding something I own that will put the system through its paces. The system I was replacing wasn't dead weight; it could still handle pretty much everything I'd thrown at it, but looking over the recommended requirements for Mass Effect: Andromeda made me realize that I was only a micron away from falling off the trailing edge of what I'd be able to run, and very, very soon.

The first Big Test was probably the biggest game I have that would yield true results: Star Citizen. I had been able to run SC before, but not all that well; the visual lag, coupled with motion blur that can't be turned off, resulted in a real headache for me. On the new system, though, SC ran like a real game. I was able to sprint through Port Olisar, jump into my Connie, and take off. Moving through the ship was a breeze, and I was even able to get back into the ship when I accidentally shot myself out of the airlock, without worrying about mistiming due to lag.

Not representative of temps, but the number of control panels this motherboard offers is staggering

As a consequence of picking up a GTX 1070 from Newegg, I scored a copy of Ghost Recon: Wildlands, a game I'd been cool on but interested in if I could play it with others. This game ran exceedingly well and only notched the CPU up to about 68C/154F, which from what I'm seeing is either average for an i7-7700K with water cooling or on the lower side. The only real issue I ran into while testing with GR:W comes down to the fact that I'm still using physical platter HDDs.

I have an SSD for my main OS drive, which I try very hard not to install anything on, and I have moved all of my high-access content (page file, Documents, Downloads, Videos, etc.) to one of the physical drives. Everything else is installed to one of two 500GB platter drives: one specifically for games, and the other for everything else. When running GR:W, then, the only hiccups cropped up when the game needed to pull data from the disk. It hitched and paused for a few seconds, which might be OK for an RPG, but for a game where a smooth experience is what keeps you from ending up dead, it was nigh unacceptable. Defragging the HD (remember that?) helped, but it got me thinking about what's called an M.2 SSD.

How retro!

On newer motherboards, there's a slot for a device that looks like an old-school stick of RAM, with its green board and exposed microchips. The slot itself is generic, accepting anything from SSDs to wireless and Bluetooth cards. The benefit of an M.2 SSD is that its bus is supposedly faster than a conventional drive hookup (on my board, that's a 6 Gb/s SATA connection), but it's also low-profile and requires no data or power cables.
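To put rough numbers on "supposedly faster", here's a back-of-envelope sketch comparing theoretical ceilings, assuming an NVMe drive in a PCIe 3.0 x4 M.2 slot (common on Z270 boards); real-world throughput will be lower on both sides:

```python
# Back-of-envelope bus comparison -- theoretical ceilings, not benchmarks.
# SATA III: a 6 Gb/s link with 8b/10b encoding, so ~80% of raw bits are payload.
sata_raw_bits_per_sec = 6_000_000_000
sata_mb_per_sec = sata_raw_bits_per_sec * (8 / 10) / 8 / 1_000_000   # ~600 MB/s

# NVMe over PCIe 3.0 x4: roughly 985 MB/s of usable bandwidth per lane.
nvme_mb_per_sec = 4 * 985                                             # ~3940 MB/s

print(f"SATA III ceiling:         ~{sata_mb_per_sec:.0f} MB/s")
print(f"PCIe 3.0 x4 NVMe ceiling: ~{nvme_mb_per_sec} MB/s")
print(f"Rough speed-up:           ~{nvme_mb_per_sec / sata_mb_per_sec:.1f}x")
```

The catch is that an M.2 drive can also be a plain SATA drive in a different shape, in which case it's capped at the same ~600 MB/s; the slot's protocol, not the form factor, is what buys the speed.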

So I've been considering adding an M.2 drive to this system, but it'll have to wait because I've already spent as much as I'm able to on this system at this time. Alternatively, I'm now in a place where I can consider whether or not to get on the VR bandwagon (or, more accurately, the much smaller VR Red Ryder wagon). I looked over what kinds of games on Steam require VR and came away pretty unimpressed. I have Elite: Dangerous already, and while I know what a boon head tracking is (thanks to having used TrackIR with the game), I'm not sure shelling out hundreds for a VR setup would be worthwhile just for Elite. Other promising items like Star Trek: Bridge Crew sound absolutely amazing on paper, but since I don't know too many people who also have VR setups, I'd have to play with *shudder* the general population. Really, right now I'm thinking that I should shelve VR until V2.0, or, if I'm hell-bent on it for some reason, look at lower-cost options like PSVR (which seems to have a better lineup than what I saw on Steam).


A Long Time Coming – A New PC

Posted by on Mar 14, 2017 in Featured, Hardware


My current system is about 8 years old. It's an Alienware (ignores boos from the crowd) and has had some minor upgrades over time — video, RAM, SSD, etc. It can still handle a good 90% of anything I've thrown at it, but looking at the specs for games like Mass Effect: Andromeda and Ghost Recon: Wildlands, as well as the perennial slog that is Star Citizen, I realize that the end of the road is well in sight.

Last year was supposed to be The Year, but my annual bonus had to be applied to our taxes and there was nothing left over to spend on a new PC. This year, however, checking off a different box on our tax forms landed us a nice little refund, which left my bonus available to use on upgrading (with appropriate permissions from The Boss, of course).

Here’s the damage:

  • Intel i7-7700K processor: I like to future-proof, which is why my current machine lasted so long. The 7700K was just recently released and is roughly the same price as the 6700K.
  • MSI Z270 Gaming M7 1151 motherboard: This is probably way more hardware than I need. I’m not usually an overclocker, but in the interest of possibly squeezing out extra performance later on down the line, I figure that this mobo and the 7700K processor will let me last longer with this config than I would be able to otherwise.
  • Corsair Vengeance 16GB DDR4 3000 RAM: I debated 16 or 32GB, but this is something I can always add to later on, so in the interest of offsetting the cost of the mobo, I went with 16GB for now.
  • Corsair Hydro H100i liquid cooling: Because now that I have had liquid cooling, I can’t go back to normal fans.
  • Corsair CX750M Bronze PSU: Again, way more than I'll need, but when I need it, I'll most certainly have it.
  • EVGA GeForce GTX 1070 SC ACX 3.0 Black Edition video: Video cards suck, because not only do they release new ones all the time, but each manufacturer has at least 8 different variations. Again, this is a "superclocked" version with a core clock speed of 1594MHz, which is better than the Founders Edition I had been looking at, and it has EVGA's fancy-ass "ACX 3.0" cooling system, which I'm sure is a way for them to charge more for the card.

The grand total was $1,264.94, most of which was the processor, the video, and the motherboard.

You may notice that some elements are missing, like the hard drives and a case. I have about 20 different cases in my house right now, including a nice one being used for my daughter's PC, which sits dormant 98% of the time (the other 2% is when I use it to run a Trade Wars server). I plan on moving her stuff to a lesser case so I can use the nicer one. I will also be re-purposing the 250GB SSD and the two 500GB platter drives from my current system. I anticipate having to upgrade the SSD in the next few years, if not sooner, because I've had it for a few years now, and although I only use it for the OS, its write-endurance limits are never far from my mind.


Revenge of the Automated Home – OSRAM Lightify

Posted by on Nov 7, 2016 in Hardware, Other Geeks


The Family was down at Lowe's this weekend because we're finally getting around to using the spare room upstairs, and by "using the spare room" I mean "remembering it exists and we really should clean it out and paint it and use it for something other than a random item graveyard." The use in question: crafting room. My daughter is slowly (oh so slowly) getting her feet wet with cosplay and is looking for a permanent place for the sewing machine. My wife has a bunch of random craft things — jewelry making, Cricut paper cutting, et al. — which are currently ensconced in the basement, but I'm in the basement trying to realize my vision of turning it into a home theater, and last time I looked there was no "crafting nook" at the local cineplex. So the crafting stuff is going upstairs, once the spare room has been cleaned and painted. Hence the trip to Lowe's, for paint.

While there, I snuck over to the lightbulb section because it's time to come clean: I'm newly obsessed with LED bulbs. Now that they've come down in price I can seriously start looking at them. We have two outside the front door, and they burn like the noonday sun. I like it, but I think the neighbors object. Anyway, while ogling the bulbs, I noticed something called the OSRAM Lightify Smart Home RGBW Lighting Strips.


We've had a problem since we moved into this house: we're the second owners, and the first owners who had it built cheaped out on pretty much everything. The owner fancied himself a handyman, and he finished the basement himself, added a three-season porch on top of the raised porch (without drywalling, adding electrical, or staining the porch first), and started a fourth bedroom in the basement before they had to beat cheeks and find another home for some reason. Much to our chagrin, the kitchen has some really crappy lighting, especially around the counter area. The best-case scenario would be to get some canister lights installed in the ceiling, but I value my life and wouldn't dare try such a feat myself, and we're not financially flush enough to hire someone to do it. Seeing this lighting strip, however, ticked two boxes for me: one, it would work great under the cabinets and over the counter, and two…home automation!

See, back in our previous house, I wired the place up with X10 gear. And I mean wired: I replaced the light switches and power sockets with addressable X10 switches and power sockets, and I ran a server in the basement that I could access remotely (this was before smartphones, so be more impressed). This was great because I could turn the outside lights on if we’d forgotten to do so and weren’t home at night, and could turn the Christmas tree on and off before getting out of or after getting into bed. I felt like a god with the power over electricity in my house, although this was the era of X10 which, if you were alive and Internet-enabled back then, you remember as being some of the first and most prevalent ads ever to rock a popover browser window. The remote control was a square box with a lot of ungainly buttons, but it worked, and after many years without it in this house, I missed it. 

The RGBW lights worked like a champ. They're nice and bright, and a single strip is long enough to cover the one stretch of counter where we do the most work. There was a problem, though: the set we bought wasn't enough to use the "home automation" aspect of the toy…er, tool. For that, we needed another piece of gear: the Hub.


This afternoon I snagged the hub-and-bulb kit because it was only $10 more than the hub alone, and a single bulb was actually $14. I raced home and plugged the hub into a wall socket — any wall socket — and downloaded the smartphone app to get the party started. 

This would actually turn out to be a party thrown by someone who’d never seen a party. First, the app wanted me to scan the QR code on the back of the hub because that registered the serial number of the device. Then I had to make an account. I got an error, but the account went through. I had to switch my phone’s network to the private network of the hub, and then use my phone as a handshake between the hub and my home network (“hub, this is network, network, this is hub”). This took a few tries because I have a wifi extender that I have apparently forgotten the password for, and then the hub and the network wouldn’t talk once I got the right wifi SSID set up. I called the app some bad names, and it decided to cooperate, so chalk one up for foul language. 

Next, before I could start lighting things up (literally), I had to upgrade the firmware on both the hub and the light strip. The strip itself has a small wireless radio, but it's invisible to everything except the official Lightify hub, and the hub said everyone needed some new clothes. Another nightmare involving a phone reboot, more harsh language, and some time spent on the Xbox later, I managed to get everything registered and ready to go.

Where does this leave me, exactly, you might ask? Well, the light strip doesn't have a switch. That's OK, because the point of this system is that it's entirely modular. They sell switches that you can stick anywhere, so I could get a dedicated switch and put it next to the stove, for example, and control these lights from there. Right now, though, I can only turn the lights on and off from the smartphone. Not ideal, but not a deal breaker either. Although it's not really useful for the kitchen, this lighting strip can modulate its color through the app. There are several different built-in lighting schemes I can apply automatically, which got me thinking…but I'll cover that later. The strip is dimmable, and when I turn it off it doesn't just snap off; it dims itself to darkness. Pretty sexy! I can also set schedules, so I set the counter light to turn on at 5PM and turn off at 10PM, just in case we need it. It's a low-wattage device, so I don't anticipate it being a massive energy hog.
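For the curious, the schedule is the simplest part of all this. Here's a minimal sketch of what the 5PM-on/10PM-off rule boils down to, with a made-up set_strip_power() helper standing in for whatever call the Lightify hub (or a friendlier third-party hub) actually exposes; the app handles all of this internally, so this is purely illustrative:

```python
import datetime
import time

def set_strip_power(on: bool) -> None:
    """Hypothetical stand-in for the hub call that switches the strip.
    The real Lightify app/hub does this for you; shown here only to
    illustrate what a lighting schedule amounts to."""
    print(f"Counter strip {'on' if on else 'off'}")

ON_AT = datetime.time(17, 0)   # 5:00 PM
OFF_AT = datetime.time(22, 0)  # 10:00 PM

while True:
    now = datetime.datetime.now().time()
    # The strip should be lit between 5PM and 10PM, and dark otherwise.
    set_strip_power(ON_AT <= now < OFF_AT)
    time.sleep(60)  # re-evaluate once a minute
```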


The bulb ended up in a lamp in the basement, because why not? This also needed a firmware update, but is only dimmable and does not change color, although there are bulbs in this series that do. 

Overall, I give the idea an A+. The construction of the equipment also gets an A+. The software on the smartphone gets a resounding D, because it's a piece of crap; the fact that I got it working at all saved it from an F-. The good news is that these satellite items aren't proprietary, and there are other hubs I could get that would work just as well, or even better if the apps that control them are better.

So, thinking about it at work today, I realized that now that I have this hammer of home automation lighting, everything in the house looks like a nail. I have Big Plans for the home theater area in the basement, and these strip lights are the perfect low-light, walkway-style runner lights you see in more professional theaters. That they dim is a massive plus, as we can "dim the house lights" before the movie starts. If I were into overkill, I could get the RGBW bulbs, put them around the house, and when we have, say, a Christmas party, change things up to red and green and probably destroy retinas as a result. Maybe not such a good idea. But I could!


Augmented Versus Virtual Reality

Posted by on May 6, 2016 in Editorial, Hardware


As someone who lurvs technology, I've always considered virtual reality a Holy Grail. Of course, the end-game of being able to plug a cable directly into the brain to not just see a generated reality but experience it is a long way off (a matter of when, not if), so in the meanwhile I thought I could subsist on the Oculus and the Vive.

On one side, there's Rog Dolos, who was lucky enough to score a Vive and has been providing a steady stream of personal experience accounts and insights into the current generation of VR experiences. Because I have neither a capable machine nor the money to pick up a Vive of my own, I've been living vicariously through Rog's posts.

On the other side, Pete of Dragonchasers fame has been raising many good, common-sense concerns about the way we're being asked to use VR. Specifically, how it encourages an inadvertently anti-social experience: one that isn't just difficult to share with the people around us, but that seals us into a capsule, making it really difficult for anyone who might want to get our attention to do so (without scaring the crap out of us).

Which is why Pete's recent posting of a certain Facebook video really caught my attention and made me think that augmented reality might be a better option than virtual reality.

The benefits of AR were immediately obvious: you aren’t sequestering yourself in a closed environment, which means you can see your actual hands, can reach for items nearby, and can use different control schemes. It also makes you easier to get ahold of if someone needs your attention; they can just pop their head into your field of vision and talk with you.

The one downside I can imagine, though, is that you'll need to create a physical theater around yourself, like the guy in this video has done with his greenscreen. You'll need at least a whole corner of a room, and you'll have to learn to hang a greenscreen, light it appropriately, and work with chroma replacement techniques to get your video to properly "project" onto the screen. I've tried using a greenscreen with normal video, and it's as much an art as it is a science.
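If you've never touched chroma keying, the core trick is less mysterious than it sounds: isolate everything that's "green enough" and swap it for your replacement footage. Here's a minimal single-frame sketch using OpenCV; the file names and threshold values are placeholders you'd have to tune to your own lighting, which is exactly the art-versus-science part:

```python
import cv2
import numpy as np

# Load a greenscreen frame and the background to composite behind it.
# File names are placeholders; any two same-sized images will do.
frame = cv2.imread("greenscreen_frame.png")
background = cv2.imread("virtual_scene.png")
background = cv2.resize(background, (frame.shape[1], frame.shape[0]))

# Work in HSV: green is easier to isolate by hue than by raw RGB values.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
lower_green = np.array([40, 70, 70])     # hue/sat/val thresholds; tune per lighting setup
upper_green = np.array([80, 255, 255])
mask = cv2.inRange(hsv, lower_green, upper_green)

# Soften the mask a touch so the cutout edges aren't razor-sharp.
mask = cv2.medianBlur(mask, 5)

# Wherever the mask says "green", take the background; elsewhere keep the subject.
composite = np.where(mask[:, :, None] == 255, background, frame)
cv2.imwrite("composited.png", composite)
```

Real-time video is the same idea run once per frame, plus the edge feathering, spill suppression, and lighting correction that make a composite look intentional instead of like a local weather forecast.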

One thing I want to keep in mind is that the video above is perfect for seeing what the user is seeing, or at least what I think the user is seeing. With VR goggles, we external observers are limited to the dual fish-eye view, or a normal, flat representation as we watch the action on a standard monitor. We observers don't get the proper VR experience without the goggles, and a lot of the excitement of what we're seeing is lost because of our "normal" experience. With the setup shown above, at least, the AR experience translates well for external observers watching on another monitor: we see pretty much what we'd expect to see if we were in the user's place. That makes me wonder if the video above was rendered specifically for the external viewer, or if it is exactly what the user sees. The AR examples we've been shown thus far have mostly been about CGI overlays on the "real world", like robots busting through walls or holographic Minecraft on a coffee table, and since I've not had the opportunity to use either VR or AR, I'm not entirely sure what the limits of an AR headset like HoloLens are.
