Friday, May 24, 2013

Force PowerShell scripts to run from C:\scripts

Here is a little tidbit I came up with because I like to be picky about how my scripts run.  It forces a script to relocate itself and run from the C:\Scripts folder.  I usually prefer to place the comment that describes a line at the end of that line; in a few places here I had to put it on the line below instead.


$ErrorActionPreference = "SilentlyContinue"
$rand = New-Object System.Random

#--------------------[ Create and relocate to C:\scripts ]----------------------
if (!(Test-Path -pathtype Container "C:\Scripts")){
   new-item C:\Scripts -itemType Directory -Force  #--[ Create local scripts folder.
}

$TargetPath = "C:\Scripts"                              #--[ Where the script "should" live
$ScriptFullName = ($MyInvocation.MyCommand).Name        #--[ EX: script.ps1
$ScriptFullPath = ($MyInvocation.MyCommand).Path        #--[ EX: C:\Scripts\script.ps1
$ScriptHomeDir = split-path -parent $ScriptFullPath     #--[ EX: C:\Scripts, where the script resides
$ScriptWorkingDir = $pwd.path                           #--[ EX: C:\Scripts or C:\temp, where the script executes from
$ScriptShortName = [system.io.path]::GetFileNameWithoutExtension($ScriptFullPath)     #--[ EX: script (no extension)
$ErrorDetected = $False
$osv = [environment]::osversion.VersionString           #--[ Get Windows version, not required, just for convenience
$windir = [System.Environment]::ExpandEnvironmentVariables("%WINDIR%")     #--[ Get %windir% environment variable
#$windir = $env:windir                                  #--[ Alternate format
$ThisComputer1 = [System.Net.Dns]::GetHostName()        #--[ NOTE: No reason for both forms, just because.
$ThisComputer2 = $env:COMPUTERNAME
$eMailDomain = "gmail.com"                              #--[ Domain where emails will be sent
$ErrorDetected = $false

#--------------------[ Assure things run from C:\scripts ]----------------------
if (!($ScriptHomeDir -eq $TargetPath)){                 #--[ Paths are NOT same ]--
   if (!(Test-Path -path $TargetPath)){                 #--[ Does target path exist?  If not, create it ]--
      New-Item $TargetPath -type directory
   }
   if (!(Test-Path "$TargetPath\ThisScriptName")){      #--[ Is the script there?  If not, copy it there ]--
      Copy-Item $ScriptFullPath $TargetPath\$ScriptFullName
   }
   $WshShell = New-Object -comObject WScript.Shell      #--[ Place a shortcut on the desktop of current user to call the script
   $Shortcut = $WshShell.CreateShortcut("$Home\Desktop\MyScript.lnk")
   $Shortcut.TargetPath = 'powershell.exe'
   $Shortcut.Arguments = "-WindowStyle Hidden –Noninteractive -NoLogo -Command `"$TargetPath\$ScriptFullName`""
   $Shortcut.Arguments =
   $Shortcut.Save()

   #iex (Join-Path $TargetPath $ScriptFullName)
      #--[ re-invoke the script from the new location ]--
   #EXIT    
      #--[ Force termination ]--
}Else{        
      #--[ Paths ARE good, script exists, OK to execute ]--
   #if (!(Test-Path "$TargetPath\$ScriptFullName"))
      #--[ an optional additional check ]--
}
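
For reference, here is roughly what the commented-out re-invoke step could look like once enabled.  This is only a sketch using the variables defined above, not something I've wired into the posted script.

#--[ Sketch only: copy the script to C:\Scripts, re-launch the copy, and terminate the original ]--
if ($ScriptHomeDir -ne $TargetPath){
   Copy-Item $ScriptFullPath (Join-Path $TargetPath $ScriptFullName) -Force
   & (Join-Path $TargetPath $ScriptFullName)
      #--[ Re-invoke the script from the new location ]--
   Exit
      #--[ Force termination of the original instance ]--
}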
 

Tuesday, May 14, 2013

Mommy my desktop icons keep disappearing...

Well it's been a while.  I've since gotten a great new job and am back on my feet.  Busy as all get-out but that's a good thing.

I thought I'd post this because, even though the information was easy to find, I had never heard of it before, and I expect others will be looking for it as well.


I've been investigating a bizarre issue with desktop icons that's been plaguing users here at the new job. Here is what I found as well as an associated fix.


This is the response to a question posed about desktop shortcuts mysteriously disappearing:
This is a well-known problem, which, as it turns out, is actually a kind of twisted "feature" of Windows 7.

Basically, there's a script that Windows 7 runs that regularly checks your desktop shortcuts, and if it finds more than 4 of them "broken" (i.e., pointing to something that's not available at the moment), it removes all the "broken" ones!
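
To picture what "broken" means here, this is a rough sketch (my own illustration, NOT the actual Microsoft script) of the kind of check involved: enumerate the desktop shortcuts, resolve each target, and count the ones that can't be reached.

#--[ Illustration only -- not the code Windows runs ]--
$shell  = New-Object -ComObject WScript.Shell
$broken = @(Get-ChildItem "$Home\Desktop" -Filter *.lnk | Where-Object {
   $target = $shell.CreateShortcut($_.FullName).TargetPath
   ($target) -and !(Test-Path $target)      #--[ Target isn't reachable right now = "broken"
})
"Broken shortcuts found: $($broken.Count)"  #--[ The troubleshooter acts when this count exceeds 4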

I guess Microsoft feels somehow responsible for helping you keep your desktop clean. Perhaps this is a Good Thing for the non-technical people in their commercials who think up things in the back of a taxi.
This is the script that governs this activity:
C:\Windows\Diagnostics\Scheduled\Maintenance\TS_BrokenShortcuts.ps1

If you look in it, not too far from the end is a statement that compares the length of the list of broken shortcuts to 4. It's the only occurrence of the numeral 4 in the entire script.  If you replace the 4 with a very large number, your problem should be solved.
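
If you don't want to scroll through the whole file to find that line, something like this should point you at it (same path as above; the pattern simply looks for a stand-alone 4):

Select-String -Path "$env:windir\Diagnostics\Scheduled\Maintenance\TS_BrokenShortcuts.ps1" -Pattern '\b4\b'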

There is also a Microsoft support article and hotfix that claims to fix the issue here: https://support.microsoft.com/kb/2642357
Here are scenarios that will trigger this:
  • You create five or more shortcuts on the desktop of a computer that is running Windows 7.
  • These shortcuts are pointed to an external location. For example, the shortcuts are pointed to network resources or to removable storage devices.
  • The computer is disconnected from the network where the network resources reside. Or, the removable devices are disconnected from the computer.
  • You run the System Maintenance troubleshooter on the computer. (this also runs on a schedule)
The third item means ANY disconnect.  That means network congestion, server overload, IPsec issues, firewall issues, or issues with the local PC that cause it to pause (CPU or RAM overloaded).  If you see the dreaded red X on your mapped drive, that would indicate a disconnect.

The easiest way to fix this problem is to go to Control Panel; Action Center; Troubleshooting; click Change Settings on the left-hand side; then turn Computer Maintenance off.

I found out that to edit the file (which is protected) you first need to take ownership of it (or the folder), then grant the user/group you just gave ownership to "full control".  Once that's done you can COPY the file and paste a new copy, delete the original, and rename the copy to the old name.  At this point you can edit it.  I set the offending count to 499 and (so far) so good.
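
For what it's worth, the ownership and permissions steps can be done from an elevated PowerShell prompt with the stock takeown and icacls tools.  This is a rough sketch of those steps only; "Administrators" is the English group name, so adjust for your system.

#--[ Sketch of the take-ownership / grant-full-control steps described above.  Run from an elevated prompt. ]--
$file = "$env:windir\Diagnostics\Scheduled\Maintenance\TS_BrokenShortcuts.ps1"
takeown /f $file                            #--[ Take ownership of the protected file
icacls $file /grant "Administrators:F"      #--[ Give the Administrators group full control
Copy-Item $file "$file.bak"                 #--[ Keep a backup of the original before editing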

Sunday, December 23, 2012

WRT54 firmware woes

I've been running DD-WRT on my Linksys WRT54G routers for years.  It's always served me well.  I recently found out that if you want to make the new Windows 7 HomeGroups work, you MUST have IPv6 connectivity on your LAN.  That does not work out of the box with DD-WRT.  It doesn't help that there hasn't been a new build of DD-WRT in a few years.  DD-WRT has IPv6 options, but they are clunky and require additional scripting to work properly.

So, I started looking for alternatives.  I had heard of Tomato firmware before but never tried it.  Seems the "Shibby" version has most of the features I wanted so I grabbed a copy.  I must say I am truly impressed.  This firmware has so many sweet features it almost blew me away.  Go to the site and check the list.  The really nice thing is that IPv6 support just works out of the box.

The one issue I had was in flashing the new firmware.  I wound up bricking my WRT54GS v3.  Normally that's not an issue and I can recover it pretty easily.  This time the boot loader crashed hard and it was looking like I was headed for a JTAG cable to clear it.  I found a nice guide, "The WRT54G Revival Guide", that allows you to bypass the JTAG option.  By using the methods there I was able to reset the flash on the router, and it came back running the Shibby Tomato firmware I had loaded.

One gotcha with Tomato when upgrading from DD-WRT:

  • The GUI username is "admin" or "root" (username is required), ssh and telnet username is always "root", and the default password is "admin".
  • By default, the SES (aka AOSS, EZ-Setup) button is programmed to start a password-less telnet daemon at port 233 if held for 20+ seconds. If you run into a problem of not being able to login, you can use this to view or reset the password ("nvram get http_passwd" and "nvram set http_passwd=newpassword"). You can disable this behavior in Admin/Buttons.

This is from the Tomato readme file.

So everything is good.....for now.  Now I just need to get the remote units upgraded and the homegroup should start working.

 

Wednesday, December 12, 2012

Life is full of wayward wrenches...

and sometimes you get in one's way...

Just last week I was happily working on servers, decommissioning old equipment, writing scripts, and doing those things that a system admin does.  Then, wham, I get laid off.  The company is being reorganized and has to eliminate 1/3 of the staff.  A bit cliché, I must say, to be laid off just before Christmas, but it is what it is.  So, here I am sending out resumes.  Oh well, life goes on.

Thursday, November 29, 2012

Upgrading Ubuntu 8.10 (Intrepid) to 10.04 (Lucid).

I just ran into an issue where I had an older Ubuntu box running 8.10, which is not an LTS release, and wanted to upgrade it.  Since it's no longer supported, even doing simple updates wasn't working.  After searching the web for a while I found a number of answers that, combined, allowed me to upgrade.

I found I was getting these errors when running "do-release-upgrade":
  Checking for a new ubuntu release 
  Failed Upgrade tool signature 
  Failed Upgrade tool  
  Done downloading extracting 'jaunty.tar.gz' 
  Failed to extract Extracting the upgrade failed. There may be a problem with the network or with the server.


The following information is from numerous sources.  It worked for me; I cannot guarantee it will work for you.  Try not to do the upgrade over SSH, as problems can occur.

Note: Ubuntu has an upgrade guide that goes over upgrading older versions here: https://help.ubuntu.com/community/EOLUpgrades?action=show&redirect=IntrepidUpgrades


Add the following to /etc/apt/sources.list and comment out the old entries:

deb http://old-releases.ubuntu.com/ubuntu/ intrepid main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ intrepid-updates main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ intrepid-security main restricted universe multiverse
# Optional
deb http://old-releases.ubuntu.com/ubuntu/ intrepid-backports main restricted universe multiverse

deb http://old-releases.ubuntu.com/ubuntu/ intrepid-proposed main restricted universe multiverse

Update with the new repositories, and upgrade packages against the intrepid repository:

run 
 "sudo apt-get update && sudo apt-get dist-upgrade"
 "sudo apt-get install update-manager-core"

Fix the meta-release file:
Copy the old file from http://changelogs.ubuntu.com/meta-release to a local file at /etc/meta-release.rvg.  Then modify /etc/meta-release.rvg so that "archive" is replaced by "old-releases".

Modify /etc/update-manager/meta-release so that it points to the local file rather than the incorrect URI on the Ubuntu site:

[METARELEASE]
URI = file:///etc/meta-release.rvg
URI_LTS = http://changelogs.ubuntu.com/meta-release-lts
URI_UNSTABLE_POSTFIX = -development
URI_PROPOSED_POSTFIX = -proposed

run
  "do-release-upgrade"

Reboot when prompted. 

run
 "lsb_release -a" to check the Ubuntu version. 
 
After this the system upgraded to 9.04 just fine.

The next upgrade from 9.04 to 9.10 was just as simple.


Fix the meta-release file again:
Copy the file from http://changelogs.ubuntu.com/meta-release to a local file at /etc/meta-release.rvg, overwriting the existing one.  This time edit out all versions listed after Karmic and change the line "Supported: 0" to "Supported: 1" for the Karmic entry only.
Again, edit /etc/apt/sources.list and comment out the old entries.  Add these:

deb http://old-releases.ubuntu.com/ubuntu/ jaunty main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ jaunty-updates main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ jaunty-security main restricted universe multiverse
# Optional
deb http://old-releases.ubuntu.com/ubuntu/ jaunty-backports main restricted universe multiverse

deb http://old-releases.ubuntu.com/ubuntu/ jaunty-proposed main restricted universe multiverse

Update with the new repositories, and upgrade packages against the jaunty repository:

Run
  "sudo aptitude update && sudo aptitude safe-upgrade"
 
Make sure /etc/update-manager/meta-release still points to the local file rather than the incorrect URI on the Ubuntu site, as previously noted.

Run
  "do-release-upgrade"

Reboot when prompted.
 
Run
 "lsb_release -a" to check the Ubuntu version. You should now see 9.10 Karmic. 

Lastly we upgrade to 10.04.



Fix the meta-release file again:
Copy the file from http://changelogs.ubuntu.com/meta-release to a local file at /etc/meta-release.rvg, overwriting the existing one.  This time edit out all versions listed after Lucid and change the line "Supported: 0" to "Supported: 1" for the Lucid entry only.
Edit /etc/update-manager/meta-release so that it points to the original URI on the Ubuntu site.


Again, edit /etc/apt/sources.list and comment out the old entries.  Add these:

deb http://old-releases.ubuntu.com/ubuntu/ karmic main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ karmic-updates main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ karmic-security main restricted universe multiverse
# Optional
deb http://old-releases.ubuntu.com/ubuntu/ karmic-backports main restricted universe multiverse

deb http://old-releases.ubuntu.com/ubuntu/ karmic-proposed main restricted universe multiverse

Update with the new repositories, and upgrade packages against the karmic repository:

Run
  "sudo aptitude update && sudo aptitude safe-upgrade"
 
Run
  "do-release-upgrade -d"  (my system needed the -d or it would'nt update)

Reboot when prompted. The new logon prompt will show "Ubuntu 10.04.4 LTS"
 
You can also run...
 "lsb_release -a" to check the Ubuntu version. You should now see 10.04 LTS Lucid.
  

Thursday, August 23, 2012

In the beginning....

This is a very late entry into the blogosphere.  I've never had much in the way of time to devote to this sort of thing so I've never tried it.  I figure it's time to put down my thoughts and any pertinent references I need quick access to.  Perhaps it might help a few others out along the way.  We'll see how it goes.

In case you're wondering about the name of the blog, I consider myself a "gearhead".  It's more of an insider term that summarizes "geek", "nerd", "techie", etc. all in one.  I've never liked the term "geek"; that's a carnival performer who eats bugs.  Similarly, I've never liked "nerd" since it has a very specific connotation associated with it.  "Gearhead" has always seemed like a better fit, so the title felt appropriate.