Hi daveengle,
Thanks for posting your question. I'm more than happy to assist you today. There are two methods for keeping spiders (search engine bots) from crawling your website. The first involves creating a robots.txt file at the root of your site. Well-behaved bots read this file before crawling and follow the rules it lays out. To learn more about robots.txt files, please visit:
www.robotstxt.org
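As a quick sketch (the directory name here is just a placeholder; swap in whatever paths you actually want to block), a robots.txt that asks all bots to stay out of a /private/ directory looks like this:

    User-agent: *
    Disallow: /private/

Using "Disallow: /" instead would ask bots to skip the entire site.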
The second option is to place a "noindex, nofollow" robots meta tag in the <head> of each webpage you wish to keep out of search results. Here's the basic tag to do so:
<meta name="robots" content="noindex, nofollow">
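To show where that tag belongs, here's a minimal sketch of a page with it in place (the title and body content are just placeholders):

    <!DOCTYPE html>
    <html>
    <head>
      <meta name="robots" content="noindex, nofollow">
      <title>Example page</title>
    </head>
    <body>
      Page content here.
    </body>
    </html>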
Also, I'd suggest creating a Google Webmaster Tools account. It gives you reporting on, and some control over, how Google's bot crawls your website.
I hope this helps! If you need further assistance please feel free to contact us.
Thank you!
Tim S