Mark Minasi's Windows Networking Tech Page
Issue #47 Late April 2005

To subscribe, visit To unsubscribe, link to To change e-mail address, switch between HTML or text format, etc., link to  Visit the Archives at  Please do NOT reply to this mail; for comments, please link to  Document copyright 2005 Mark Minasi.

What's Inside

  • News:
    • LAST 2005 Active Directory and Security Seminars Coming To Dallas In May, Chicago and Philly in June
    • New:  a complete list of all Tech Page articles since 1999!
  • Tech Section
    • Renaming Someone's 2003 Exchange Mailbox:  The Final Chapter
    • Did Installing An Anti-Spyware Program Kill Your IP Networking?
    • Introducing Log Parser, a Tool You Must Learn
  • Conferences
  • Bring a Seminar to Your Site


This month, the newsletters get a table of contents (yay!) and we offer the final chapter in our story about AD names, as a reader cracks the "I changed my Exchange name but Outlook took 12 hours to recognize the change" mystery, then I pass along a tip I discovered that's let me fix a number of IP stacks broken by anti-spyware tools.  But the thing I think you'll like will be my step-by-step look at Log Parser, a fantastic free tool for doing analysis, statistics and reports on your IIS logs, your Active Directory, your ... well, trust me, this thing does a lot.

But first, a word from our sponsor...

The Last 2005 Seminars are Coming to Philly, Chicago, and Dallas Soon!

I have to buckle down and get some work done on writing so the May and June seminars in Dallas (week of May 9), Chicago (latter part of the week of June 13) and Philadelphia (week of June 20) are the only ones for the rest of the year.  Please consider joining me at one...

Seminar: Securing Your Windows Desktops and Servers

Everyone wants to secure their network, but many don't feel that they have the time. I find that a lot of people have a general idea about what they should be doing to secure their networks -- they've heard terms like SMB signing, null session, secure channel, LM hash, and so on -- but haven't the time to sift through the often-contradictory Knowledge Base articles and the welter of group policy settings, Registry hacks, patches and the like. In just one day, I go through what the big security issues mean and help attendees understand the exact, step-by-step methods that you need to know to make your systems more secure.

If you'd like to find out more, please visit

Running a 2003/2000-Based Active Directory

AD's great, but it can be a fragile flower if not built and maintained properly. Find out how to build, implement, maintain, and repair Active Directory at "Running a 2003/2000-Based Active Directory;" information at .

New:  a Complete List of All Tech Page Articles Since 1999

I know that sometimes it's hard to find an old article in the Archives at, so now we've got a list of major articles in every newsletter since December 1999's premier issue at  But don't thank me -- it's a gift to all of you from my sweetie Rhonda; she did all the hard work.  I hope you find it useful!

Tech Section

Renaming An Exchange 2003 Mailbox: The Final Chapter

You see, I knew that I shouldn't answer Exchange questions...<g>

Two months ago, I talked about how to solve problems changing Active Directory names.  Many of you wrote in asking how to change Exchange e-mail box names, so I did a bit of research and wrote about it last month.  But I commented that for some reason it took about 12 hours for changes to kick in.  Some of you were kind enough to write in and conjecture (correctly) that I had Outlook 2003 in cached mode -- and, as I travel, of course I did.

Reader Michael B. Smith, an Exchange MVP, added these details:

Exchange updates offline address books every night and stores the results in public folders. (There are both incremental and full entries in the public folders, but let's not complicate the situation... and you can change when Offline Address Book (OAB) generation executes.)

Cached mode in Outlook downloads the most recent entries in those public folders, if necessary, each morning and then updates the address book that is used in cached mode.  The primary reason for this is so that the address book is available when you are offline/without a network, and to improve your Outlook experience when you are on a slow connection.

Outlook Web Access (OWA) doesn't do this. In most cases, the OWA backend actually generates LDAP queries against AD to satisfy the information request.

This is the short answer. It's actually a somewhat elegant and very involved process that is, in some places, quite kludgy.

So basically Exchange re-generates the address book once a day, and Outlook knows to look for it when started up.  But what if you wanted to generate the address book more often?  Michael explained that as well:

First, ask Exchange to rebuild the address book. (This is normally a scheduled process that happens at 4:00 am, by default.) To do this, open Exchange System Manager, expand Recipients, and click on Offline Address Lists. In the right pane, right-click on the offline address list of interest (you probably only have one -- the Default Offline Address List) and select Rebuild. You will be warned that it potentially could take a long time. Click Yes to proceed with the rebuild. Wait a couple of minutes (Exchange's scheduler runs on a one-minute clock, so it could be as much as 60 seconds before the process begins. After it starts, in a small domain, it'll probably only take a few seconds to rebuild the address list). Now, you are done with Exchange.

 Second, move to Outlook and ask it to download the updated address book. On the main toolbar, click on the down-arrow next to Send/Receive. In the menu, click Download Address Book. On the resulting dialog, click OK.  You are done with Outlook.

 I have intermittently had issues with Outlook that required me to restart Outlook before the changes appeared, but that isn't necessary normally.

 Just FYI: Outlook will normally check on the availability of an updated OAB 24 hours after it downloaded the last one. Or about five minutes after Outlook is restarted. So, after a rebuild, you can restart Outlook; and in about 5 minutes it should automagically download the update. But that requires waiting and I don't like waiting. :-)

That explains the mystery of the 12-hour delay, and thus ends a thread that was looking like it'd live longer than a George Lucas epic.  Several of you offered suggestions, and I sincerely thank all of you (and Michael)!

Did Installing An Anti-Spyware Program Kill Your IP Networking?

Sometimes the cure is worse than the disease.

A good friend installed Microsoft's Anti-Spyware beta at my recommendation.  (I can't stress enough how much you need this or some other anti-spyware tool.  I've liked Ad-Aware as well.)  Anyway, he had a terrible reaction to the anti-spyware tool -- his IP stack no longer worked.  I was puzzled until I ran across this feature of XP's SP2 and 2003's SP1.  Just type this command:

netsh winsock reset catalog

As it turns out, many things insert themselves between your IP stack and the rest of your system.  Most are benign or outright helpful.  But because something that sits between the rest of the OS and the IP stack may not be the ONLY thing in that position, these in-between programs must play well with one another: they all sit in a line and must be aware of who's ahead of them and who's behind them in line.  That way, if one in-between program exits, it knows to link the program behind it in line to the one in front of it.  (This "line" is the "catalog" in the above command.)  But some poorly written browser helper objects (one type of these in-between programs, often used in spyware) aren't good at cleaning up after themselves, and when those kinds of browser helper objects are removed by anti-spyware tools, the line of in-between programs can get messed up, with the result that you can't do anything over IP.
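The "line" described above behaves like a doubly linked list: each entry knows who's ahead of it and who's behind it, and a well-behaved entry splices its neighbors together on the way out.  Here's a toy Python sketch of that idea -- the class and all the names are mine for illustration, not the real Winsock catalog API:

```python
class CatalogEntry:
    """One 'in-between program' (layered provider) in the chain."""
    def __init__(self, name):
        self.name = name
        self.prev = None   # entry ahead of us in line
        self.next = None   # entry behind us in line

    def link_after(self, other):
        """Insert self into the line directly behind 'other'."""
        self.prev, self.next = other, other.next
        if other.next:
            other.next.prev = self
        other.next = self

    def remove_cleanly(self):
        """A well-behaved exit: splice our two neighbors together."""
        if self.prev:
            self.prev.next = self.next
        if self.next:
            self.next.prev = self.prev
        self.prev = self.next = None

# Build a three-entry chain: IP stack -> helper object -> application side
stack = CatalogEntry("tcpip")
helper = CatalogEntry("browser-helper")
app = CatalogEntry("app-side")
helper.link_after(stack)
app.link_after(helper)

helper.remove_cleanly()      # the chain heals around the departed entry
print(stack.next.name)       # prints "app-side"
```

A badly written helper object that simply vanishes without that splicing step leaves its neighbors pointing at a dead entry -- roughly what breaks the catalog, and what "netsh winsock reset catalog" repairs by rebuilding the line from scratch.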

In any case, the bottom line is that if installing some kind of anti-spyware tool has made it impossible for you to network, open up a command line and try typing "netsh winsock reset catalog."

Introducing Log Parser, a Tool You Must Learn

I'm too cheap to pay for one of those Web site analysis tools, but I'd like to be able to extract a few statistics from my Web logs -- in particular, it'd be great to know how many hits a particular page had, or how many of you took a moment and read this newsletter.  When I asked my friend and IIS expert Brett Hill, he got this mystical look in his eyes -- you know the way people look when they're about to tell you about the Secrets Of The Universe that they've recently discovered? -- and beatifically intoned, "Log Parser."

Now, I'd already heard about Log Parser, but I'd also heard that it was a [fill in your favorite frustration-related adjective] nightmare to understand syntax-wise.  Brett said no problem, he was going to do a talk about Log Parser at the next Windows Connections conference.  But Brett got hired away by Microsoft -- he's now an IIS 7.0 Evangelist and yes, I did mean to type "7.0" rather than "6.0" -- and so Randy Franklin Smith, a big-time security techie, stepped in.  In his presentation, Randy did just what I needed him to do:  give me a bunch of already-working examples of Log Parser syntax so I could get started.  So I've been working with it and in this article, I'll explain why you really want to learn it and then I'll explain some of its truly nightmarish syntax.

I strongly recommend that you give this article a look.  This is a very useful tool and, of course, the price is right.

What Log Parser Can Work On

Log Parser is a free command-line tool that can analyze and report on a variety of files.  I've already suggested one use, to count the number of times that this newsletter has been viewed.  But Log Parser can also analyze event logs, your Active Directory ("show me all of the managers -- that is, everyone who appears in the 'manager' attribute of my user accounts -- and compute how many people each person manages"), the file system ("show me the names of the ten largest files on the hard disk"), any text file ("how many times does the word 'really' appear in this document?"), Network Monitor output, the Registry ("how many REG_DWORD values exist in my Registry?") and a number of other formats.  It will then output that data as text, a file, a data grid, new entries in a SQL database, SYSLOG entries, XML, and so on.

Installing Log Parser and Some Sample Data

You can find Log Parser at Microsoft's downloads section.  It's a simple MSI file and so a snap to install.  Unfortunately it installs itself to a folder in Program Files without modifying the system's PATH environment variable, meaning that you've got to be in Log Parser's directory to run it or you'll get a "bad command or file name" error.  Either add Log Parser's directory to your PATH variable, or if you're lazy like me then just copy the logparser.exe file into System32.  Then open up a command line and you're ready to start parsing.

But we'll need some data to analyze.  I've simplified (and sanitized) two days' logs from my Web site and put them at

Right-click that URL from Internet Explorer and choose "Save target as..." to save the file to your hard disk.  Unzip it and you'll find two log files -- put them in a directory named c:\logfiles.  Make that your default directory ("cd \logfiles") in your command prompt window and your commands can be shorter, as you won't have to type "c:\logfiles" in your Log Parser commands.  With that, we're ready to go.

Note:  here's another strong recommendation:  grab that ZIP file, download Log Parser and try this stuff out.  If you're feeling lazy, remember you can always just copy a Log Parser line from the Web page you're reading and paste it into your command prompt.

A First Query

Let's try out just about the simplest Log Parser command possible:

logparser "select * from *.log" -i:iisw3c

Picked apart, it is the command "logparser," followed by a SQL query statement -- don't run away, I'll show you all the SQL you'll need today! -- followed by the -i option, which explains to Log Parser what kind of file it is (an IIS log file, in this case).  The SQL query is "Select * from *.log", which just means "get everything" -- the asterisk works in SQL the same as it does in DOS commands, meaning "everything" -- from all of the files with the extension "log" in the current directory. 

(Aside: this is why learning Log Parser is difficult -- you're trying to learn two new things at the same time.  Half of what you're trying to learn is Log Parser's syntax, which is ugly enough all by itself.  But every Log Parser query includes a SQL query, and if you've never written SQL queries then you'll find that they're a quite wide field of syntax to master as well.  I strongly recommend taking the time to browse through the logparser.chm Help file that installs in the same directory as Log Parser.  And let me note at this point that I'm not a SQL query expert, so I may not be approaching these problems in the best way.)
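If you'd like to practice the SQL half separately from Log Parser, Python's built-in sqlite3 module speaks a close cousin of this dialect.  Here's a sketch with invented rows -- the table, the column names and the data are all mine, and SQLite won't accept the hyphen in "c-ip" as a bare identifier, so it becomes c_ip:

```python
import sqlite3

# An in-memory stand-in for the IIS logs; the rows are invented sample data.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE log (c_ip TEXT, cs_uri_stem TEXT)")
db.executemany(
    "INSERT INTO log VALUES (?, ?)",
    [("194.1.2.3", "/thismonth.htm"),
     ("194.1.2.3", "/default.asp"),
     ("10.9.8.7",  "/thismonth.htm")],
)

# The same "get everything" query: * means every field, as in Log Parser
rows = db.execute("SELECT * FROM log").fetchall()
for row in rows:
    print(row)
print(len(rows), "rows")   # prints "3 rows"
```

The asterisk works the same way here as in Log Parser: SELECT * fetches every column of every record.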

Controlling Where Log Parser Puts Its Output: rtp, FROM and Data Grids

You'll get ten lines of output and then a "press a key..." prompt.  Given that there are about 6491 lines in the two logs and assuming that you want to see every line, that'd mean you'd have to press a key about 649 times... yuck.  That's where the -rtp ("records to print?") option comes in; set it to 100 and it'll only pause every 100 lines.  Set it to -1 and you'll never see a "press a key..." again:

logparser "select * from *.log" -i:iisw3c -rtp:-1

Of course, that still takes a long time and is kinda useless in a command prompt window.  We can tell Log Parser to stuff the result into a file by adding the "INTO" option.  It goes in the SQL query before the FROM part.  This takes the output and puts it in a file called OUTPUT.TXT:

logparser "select * into output.txt from *.log" -i:iisw3c -rtp:-1

Open output.txt in Notepad and you'll see that you've got all of the info from the two logs nicely collected in output.txt.  But Log Parser can output its data in other ways as well.  In particular, it can use a built-in thing that 2000, XP and 2003 contain called a "data grid."  We tell Log Parser to output to something other than its "native" format (dumping all of the junk onto the screen) with the -o: option:

logparser "select * from *.log" -i:iisw3c -rtp:-1 -o:datagrid

I guess I shouldn't be surprised given that Windows is 40 or 50 million lines of code these days, but it's always interesting to learn that there's something "new" that I already owned.  Notice the "Auto Resize" button -- click it and the columns figure out how wide they should be; very nice.

Seeing What's In an Input File

Notice what's going on here -- Log Parser used spaces, tabs or commas to separate -- "parse" is the correct term -- each line into particular items.  You then see the names of those items in the column headers.  For example, the IP address of the person visiting my Web site is in the c-ip field, and the file that they viewed is in the cs-uri-stem field.  Or alternatively you can ask Log Parser about the "iisw3c" format like so:

logparser -h -i:iisw3c

But what we've seen so far is really just a straight dump of the logs, no computation or analysis.  What if I just wanted to see the IP addresses of my visitors?  I'd do that by restricting the things that I SELECT:

logparser "select c-ip from c:\logfiles\*.log" -o:datagrid -i:iisw3c -rtp:-1

Doing a Little Analysis and Beautifying:  COUNT, GROUP BY and AS

Ah, a much smaller amount of data, but again no analysis.  It'd be more interesting to see how often each one visited.  Again, I modify the SQL SELECT statement.  I can create a new field that reports the number of times that a given IP address appears by adding a "count(*)" variable to the SELECT statement.  COUNT does just what you'd expect it to do -- it counts records.  A super-simple example might be (note that this doesn't work; I'm just introducing the idea):

logparser "select c-ip, count(*) from *.log" -i:iisw3c -o:datagrid -rtp:-1

Now, if this did work, it'd list two columns -- each IP address and how often that IP address appears.  If something showed up 30 times, you'd see it 30 times with a count of 30 next to it each time.  But, again, this doesn't work, and Log Parser says that it wants a "group by" clause.  So let's do it Log Parser's way and add a GROUP BY clause:

logparser "select c-ip, count(*) from *.log group by c-ip" -i:iisw3c -o:datagrid -rtp:-1

That works, and we get about 1200 entries instead of almost 6500, as the duplicates are gone.  But the data grid's column label for the c-ip count is "COUNT(ALL *)," which is not all that meaningful.  That count is the number of visits that a particular IP address made, so we'd like Log Parser to call that column something more meaningful, like for example, oh, "visits."  I can do that by adding an AS clause to the list of selected items:

logparser "select c-ip, count(*) as visits from *.log group by c-ip" -i:iisw3c -o:datagrid -rtp:-1
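If you're following along in SQLite rather than Log Parser (here via Python's sqlite3 module; the table, the c_ip column and the rows are invented for illustration), the same SELECT/COUNT/GROUP BY/AS pattern looks like this:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE log (c_ip TEXT)")
db.executemany("INSERT INTO log VALUES (?)",
               [("194.1.2.3",), ("194.1.2.3",), ("10.9.8.7",)])

# One row per distinct address, with the count renamed to "visits" by AS
rows = sorted(db.execute(
    "SELECT c_ip, COUNT(*) AS visits FROM log GROUP BY c_ip"
).fetchall())
print(rows)   # [('10.9.8.7', 1), ('194.1.2.3', 2)]
```

Just as in Log Parser, GROUP BY collapses the duplicates and COUNT(*) reports how many records each group swallowed.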

Sorting and Shortening:  ORDER BY, TOP, DESC and HAVING

That query's output has a better column title, but the list shows the IP addresses in no particular order. It'd be nice to have it sorted by frequency, so we add the "ORDER BY" clause (this should be typed as one line although I've  broken it up so that your browser doesn't make you scroll left and right):

logparser "select c-ip, count(*) as visits from *.log group by c-ip 
order by visits" -i:iisw3c -o:datagrid -rtp:-1

Neat; now it's easy to see that one IP address visited over 140 times.   But there are an awful lot of one-visit IP addresses; can we just see the top five visitors?  Sure, with the TOP clause (again, type as one line even though I've broken it for a happier browser experience):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip 
order by visits" -i:iisw3c -o:datagrid -rtp:-1

Hmmm... that showed me the "top" five, all right, but the "top" of the data grid is the low numbers, as it starts out with the one-visit IPs and ends up with the multi-time visitors at the bottom of the data grid.  How to see the bottom five?  Well, there isn't a BOTTOM clause, but we can tell it to sort descending rather than its default, ascending, by adding the DESC clause to ORDER BY.  Then the most frequently-visiting IP addresses end up at the top of the results and TOP 5 works as we'd hoped (type as one line):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip 
order by visits desc" -i:iisw3c -o:datagrid -rtp:-1

Or alternatively I might just want to see all of the IP addresses that have visited me more than 50 times. I can do that with the HAVING clause of the SQL query (type as one line):

logparser "select c-ip, count(*) as visits from *.log group by c-ip 
having count(*) >50 order by visits desc" -i:iisw3c -o:datagrid -rtp:-1

Notice that in that case I removed the "top 5" and added "having count(*) > 50" after the "group by."  Log Parser will squawk at you if you put them in a different order.  SQL queries want to see their clauses in a particular order -- first the SELECT, then the things you're selecting, then the FROM, then the GROUP BY, then the HAVING, then the ORDER BY.  If you're using an INTO, it goes after the things you're selecting and before the FROM.  So for example if I wanted the output of my query to go to a text file named out2.txt, I'd type (yes, again please type as one line although it's broken):

logparser "select c-ip, count(*) as visits into out2.txt from *.log 
group by c-ip having count(*) >50 order by visits desc" -i:iisw3c -rtp:-1
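Standard SQL engines accept the same shape, with one spelling difference: TOP n is Log Parser's way of saying what SQLite and many databases write as LIMIT n at the end of the query.  A sqlite3 sketch with invented rows showing GROUP BY, HAVING, and ORDER BY ... DESC together:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE log (c_ip TEXT)")
# 5 visits from one address, 3 from another, 1 from a third
db.executemany("INSERT INTO log VALUES (?)",
               [("1.1.1.1",)] * 5 + [("2.2.2.2",)] * 3 + [("3.3.3.3",)])

rows = db.execute(
    "SELECT c_ip, COUNT(*) AS visits FROM log "
    "GROUP BY c_ip HAVING COUNT(*) > 2 "
    "ORDER BY visits DESC LIMIT 5"   # Log Parser would say TOP 5 up front
).fetchall()
print(rows)   # [('1.1.1.1', 5), ('2.2.2.2', 3)]
```

The one-visit address never appears: HAVING threw its group away after the counting was done, and DESC put the busiest visitor first.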

Side Trip:  More Useful SQL Syntax: WHERE, LIKE and More

As long as I'm talking about the format of the SQL SELECT statement, let's take a short side-trip from questing for the perfect query (remember, I'm trying to figure out how many people saw thismonth.htm) and look at what the SELECT statement can do in a bit more detail. 

If you have a really long, ugly SQL query, then you can put it in a text file and refer to it.  For example, consider that last query -- it had a huge SELECT statement.  I could type its SQL part, "select c-ip, count(*) as visits from *.log group by c-ip having count(*) >50 order by visits desc", in a text file like so:

select c-ip,
count(*) as visits
from *.log
group by c-ip
having count(*) > 50
order by visits desc

I then save that in a file I'll call myquery.sql (but I could call it anything).  Then this command gets the job done:

logparser file:myquery.sql -i:iisw3c -o:datagrid -rtp:-1

But I've left out a couple more types of clauses that you can put into a Log Parser SQL query.    There are also the WHERE and USING clauses.  (By the way, Log Parser only supports a subset of honest-to-God SQL.  Thank goodness.  And USING seems not to be a standard SQL clause.)  The order of these clauses in a Select statement is:

  • SELECT (and the list of fields you're selecting)
  • USING
  • INTO
  • FROM
  • WHERE
  • GROUP BY
  • HAVING
  • ORDER BY

And no, I haven't defined what USING does yet; that's kind of beyond the scope of this introductory article.  I just wanted to offer one place where I presented all of the SQL clauses in order for easy reference.  Let's take up WHERE next.

By saying "select c-ip, count(c-ip) from *.log...", I told Log Parser to get the c-ip data from every single line in all of the logs in my local directory, and then do its analysis on those records.  But sometimes I want to tell Log Parser not to fetch every single record, but instead a subset.  I do that with the WHERE clause.  Suppose (I know this is a stretch but I need an example that fits into our data set) I only want to see entries where the IP address starts with "194."  This WHERE clause will do that:

logparser "select top 5 c-ip, count(*) as hits from *.log where c-ip like '194.%.%.%' 
group by c-ip order by hits desc" -i:iisw3c -o:datagrid

This command includes the clause "where c-ip like '194.%.%.%'", and "LIKE" means "matches a particular pattern."  Patterns can either be literal characters, or they can use the "_" and "%" wildcards.  "_" means "match exactly one character" and "%" means "match zero or more characters."  The pattern '194.%.%.%' -- notice that LIKE patterns are always surrounded by single quotes -- would then match any IP address that starts with 194, followed by a period, followed by anything (the first %), followed by another period, followed by anything, followed by a final period and anything.  Here are a few more LIKE pattern examples:

Pattern      Examples that would match
'Mark'       Mark ("mark" wouldn't do it; case matters)
'Mark%'      Mark Minasi, Mark77, Marky, Mark
'Mark_'      Marky, MarkM (neither Mark nor Mark Minasi would work)
'Mar%k'      Mark, Maraardvark
'%'          any string at all, or even nothing
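The wildcard rules in that table translate directly to regular expressions ('%' becomes '.*' and '_' becomes '.'), so you can test a pattern before burning a long query on it.  This little helper is mine, not part of Log Parser, and it follows the table's case-sensitive behavior:

```python
import re

def sql_like(pattern, text):
    """True if text matches a SQL LIKE pattern, per the table above:
    '%' = zero or more characters, '_' = exactly one, case-sensitive."""
    regex = "".join(
        ".*" if ch == "%" else "." if ch == "_" else re.escape(ch)
        for ch in pattern
    )
    return re.fullmatch(regex, text) is not None

print(sql_like("Mark%", "Mark Minasi"))   # True
print(sql_like("Mark_", "Mark"))          # False: '_' wants exactly one more character
print(sql_like("Mar%k", "Maraardvark"))   # True
print(sql_like("Mark", "mark"))           # False: case matters
```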

There is also a NOT LIKE command.

So we've seen that we can use WHERE to restrict the things that SELECT does its work on.  But doesn't HAVING do that as well?  Kind of, but not exactly.  WHERE restricts the data that SELECT looks at to do its analysis; HAVING restricts the results of that analysis.  If that's not clear, let's do another query that will make it clear, as well as giving me an excuse to do some more Log Parser examples.

Querying For the Most Popular Files

We've explored our most-frequently-visiting IP addresses, but recall that wasn't really what I wanted to do -- I wanted to see how often people viewed thismonth.htm.  Can you see how you'd change the query so that we're not seeing the most frequently-visiting IP addresses, but instead the most-requested files?  A look at the data grid output shows that the name of a requested file -- default.asp, thismonth.htm, or the like -- is in the field "cs-uri-stem."  Form the same query as before, but replace "c-ip" with "cs-uri-stem."  Additionally, "hits" is probably a better word than "visits" and "requested-file" is more meaningful to most than "cs-uri-stem," and so we end up with this query (again broken here but should be typed as one line):

logparser "select top 5 cs-uri-stem as requested-file, count(c-ip) as hits from *.log 
group by requested-file order by hits desc" -i:iisw3c -o:datagrid -rtp:-1

But let's recall that I had a particular query in mind when I got started -- how many hits did my newsletter file, thismonth.htm, get?  I could figure that out from the query that showed me the top five most-visited files, but that's kind of lame.  Instead, this query does it with a WHERE clause, reporting right to the screen (again, type as one line):

logparser "select cs-uri-stem as filename, count(*) as hits from *.log 
where filename='/thismonth.htm' group by filename" -i:iisw3c

Again, notice that WHERE clause.  SQL queries fetch some subset of the fields in a database (cs-uri-stem in this case, for example) -- Log Parser didn't grab every available field in the IIS logs, just cs-uri-stem, and so it had a smaller bunch of data to work on, which presumably would make the query run faster.  So naming particular fields in the SELECT statement instead of entering * to get all of the fields reduces the number of fields to fetch before doing some kind of analysis and reporting.  In contrast, using a WHERE clause reduces the number of records fetched.  Fewer records also means less work for the query engine, which means a faster query.

Now I can offer an example where WHERE does something similar to HAVING.  We could phrase the query this way (type as one line):

logparser "select cs-uri-stem as filename, count(*) as hits from *.log 
group by filename having filename='/thismonth.htm'" -i:iisw3c

In this second query, I told Log Parser to grab the cs-uri-stem data from every single record and do a bit of computation on it (count the frequency of each file).  Once it's done with that, Log Parser's got the breakdown of file name frequencies for every single file ever encountered.  Now, I don't want to see all of those file name frequencies; I just want the frequency for thismonth.htm.  That's what the HAVING clause does -- it says "Log Parser, you've got a huge hunk of data, but I only want you to show me a tiny bit of it."

I figured that the first query, the one with the WHERE clause, would be a bit more efficient, as it says to only bother computing the hit count on records about thismonth.htm, while the second computed hit counts on every single file mentioned in the log and then only showed thismonth.htm.  And my guess was borne out, as Log Parser reports how long it takes to do something.  And yeah, the time difference was about 0.1 second, but remember there are only two log files in our test bunch -- analyzing five and a half years' worth of logs might pay off in terms of a noticeable time difference with a more efficient query.  Of course, I could be wrong -- remember, I'm just an apprentice SQLer.  (Is that pronounced "squealer?"  I'm getting these creepy flashbacks to Ned Beatty in Deliverance for some reason.)
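You can watch the two strategies converge on the same answer with a quick experiment in Python's sqlite3 module, standing in for Log Parser's engine (the table and rows are invented):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE log (cs_uri_stem TEXT)")
db.executemany("INSERT INTO log VALUES (?)",
               [("/thismonth.htm",)] * 4 + [("/default.asp",)] * 2)

# WHERE: throw away the uninteresting rows first, then count the survivors
where_rows = db.execute(
    "SELECT cs_uri_stem AS filename, COUNT(*) AS hits FROM log "
    "WHERE cs_uri_stem = '/thismonth.htm' GROUP BY cs_uri_stem"
).fetchall()

# HAVING: count every file's hits, then keep only the group we care about
having_rows = db.execute(
    "SELECT cs_uri_stem AS filename, COUNT(*) AS hits FROM log "
    "GROUP BY cs_uri_stem HAVING cs_uri_stem = '/thismonth.htm'"
).fetchall()

print(where_rows, having_rows)   # the same single row both ways
```

Same result, different amounts of work: the WHERE version never counted /default.asp at all.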

I should also mention that, again, I'm just scratching the surface here, but here's an even more efficient way to tally the thismonth.htms:

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'" -i:iisw3c

Controlling IIS Log Dates To Query From

Now, that's all pretty neat, except for one thing:  I've been running a Web site named for quite a long time.  But I've only been offering these free newsletters since 1999 and if I recall right I've only used the "thismonth.htm" file name for the past two years.  In addition, I'm really only interested in how many people have looked at this in, say, the past month.  How, then, do I tell Log Parser "do that query, but for heaven's sake don't read every IIS log I've got going back to the beginning of time; instead, only look at entries since Friday, 29 April 2005 at 7:00 PM"?

As it turns out the particular iisw3c input type has a special option designed to do just that, as I discovered by looking in the "IISW3C Input Format Parameters" page of Log Parser help.  Just add the -mindatemod parameter, followed by the earliest time that you want the log searched in yyyy-mm-dd format.  For example, to only see the "thismonth.htm" hits since 29 April 2005 I'd type (as one line)

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'" 
-i:iisw3c -mindatemod 2005-04-29

Or, to include the time as well, add time as hh:mm:ss and put quotes around the date/time combination.  To see all the hits since 7 PM on the 29th of April 2005, I'd type (as one line)

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'" -i:iisw3c 
-mindatemod "2005-04-29 19:00:00"

One little quirk to remember is that the log files store time in Greenwich/Universal/Zulu time.  So in my case, as I live in the Eastern time zone, I'd have to ask not for 7 PM (which is Eastern Daylight Time as I write this) but instead for 11 PM Universal, as I'm four hours behind that time zone.  In the winter I'm five hours behind UTC, so 7 PM for me would be the next day's midnight.  
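Python's datetime module will do that local-to-UTC arithmetic for you.  A sketch using fixed offsets for Eastern Daylight (UTC-4) and Eastern Standard (UTC-5) time -- the offsets, not the log format, are the point here:

```python
from datetime import datetime, timedelta, timezone

EDT = timezone(timedelta(hours=-4))  # Eastern Daylight Time, UTC-4
EST = timezone(timedelta(hours=-5))  # Eastern Standard Time, UTC-5

# 7 PM local, converted to the Universal time the IIS logs use
summer = datetime(2005, 4, 29, 19, 0, tzinfo=EDT).astimezone(timezone.utc)
winter = datetime(2005, 1, 29, 19, 0, tzinfo=EST).astimezone(timezone.utc)

print(summer.strftime("%Y-%m-%d %H:%M:%S"))  # 2005-04-29 23:00:00
print(winter.strftime("%Y-%m-%d %H:%M:%S"))  # 2005-01-30 00:00:00
```

Note the winter case really does land on the next day's midnight, which is exactly the value you'd hand to -minDateMod.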

Log Parser Output Fun

Whew, that SQL syntax stuff can be rough going, if useful.  Let's take a break and play with some more fun stuff -- a few of the ways that Log Parser can output the results of your queries.  We've seen the "native" (simple text) and data grid outputs.  Just for the sake of Log Parser weirdness, try changing the output format to "neuroview" (type as one line):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip 
order by visits desc" -i:iisw3c -o:neuroview -looprows:1

That produces pretty much useless output... but it looks like the credits from the Matrix movies, so you'll no doubt impress the crowd at your next presentation.

If you've got Office loaded on your system, then you can do some graphing too.  Try creating a bar chart instead of a data grid (type as one line):

logparser "select top 5 c-ip, count(*) as visits into chart.gif from *.log 
group by c-ip order by visits desc" -i:iisw3c -o:chart -charttype:bar3d -view:on

Again, look in Log Parser's Help for more ways to display its query results.  You might sometimes want to do a really complex query that can't be done in one SQL statement; in that case, you'd do the query in parts, where you do the initial query and save those results to some kind of file (probably a comma separated variable or CSV file), then do a second query on that file, and so on.

Querying the File System with Log Parser

Let's try a query or two on something other than an IIS log.  Wondering how many MP3 files are on your system?  Well, we've seen all of the fields in iisw3c-type input files.  Here we'll use the "fs" (file system) input type; we can get a listing of its fields with the same logparser -h -i:fs approach, or by looking in the quite helpful Log Parser help file.  I find that, not surprisingly, there is an attribute called "name":

logparser "select count(*) from c:\* where name like '%.mp3'" -i:fs

You'll probably get a complaint that it couldn't complete the task but that's because it can't read the System Volume Information folder -- the results are still correct.  (If you like, you can give yourself Read permissions to the System Volume Information folder, but you'll get the same results either way.  Unfortunately there is no way that I know of to say to Log Parser, "search all of C:\ except for such-and-such directory.")  Or total up how much space they're taking:

logparser "select sum(size) from c:\* where name like '%.mp3'" -i:fs
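If you'd like to sanity-check those numbers without Log Parser, the same tally is a few lines of Python with os.walk.  The function is mine, written to skip unreadable folders (like System Volume Information) rather than give up:

```python
import fnmatch
import os

def tally(root, pattern):
    """Count files matching pattern under root and sum their sizes,
    silently skipping directories we aren't allowed to read."""
    count = total = 0
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            if fnmatch.fnmatch(name.lower(), pattern):
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                    count += 1
                except OSError:
                    pass   # file vanished or unreadable; keep going
    return count, total

# On a real system you'd point it at the drive root:
# count, size = tally("C:\\", "*.mp3")
```

The count plays the role of count(*) and the running size total plays the role of sum(size).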

Querying Active Directory

Here's a quick Log Parser AD example.  Suppose I want to get a list of all of the first names in the company, and how many people have each first name.  There are just two tricks to know.  First, the AD word for "first names," which (if you read Newsletter 45) you know is "givenname."  Second, the "FROM" part looks like


So suppose I've got an administrative account called bigguy with password "swordfish" at and I want to search the whole domain for user names.  The query would look like this (and it's an ugly one, again you'd type as one line although I split it into three on the page):

logparser "select givenname, count(givenname) as tally 
from 'ldap://,dc=com' 
group by givenname order by tally desc" -objclass:User -o:datagrid -rtp:-1 -i:ADS

More Resources

This was just a start with Log Parser.  The help's got lots of examples that are worth working through, and of course you can Google it for more.  You'll also find more links about Log Parser at  Apparently Log Parser's even got a book written about it!


Join me at ...

TechMentor Fall 2005 (San Jose):  101 Communications' semi-annual geekfest returns to San Jose this October 17-21.  Follow for more info as it appears.

Windows Connections Fall 2005 (San Diego):  I just got back from my magazine's twice-per-year tech-o-rama in mid-April, and it was one of the best ever, featuring great speakers and an amazing panoply of topics.  I'm not sure how California managed to snag three of the four best IT shows this year (Connections in San Francisco, TechMentor in San Jose, Connections in San Diego) but if you're a fan of the Golden State's weather then 2005's the conference-going year for you!  Information on Connections at  Our program chair Amy Eisenberg's trying to outdo herself so it'll be a great show.

Bring Mark to your site to teach

I'm keeping busy doing Active Directory and Security seminars and writing, but I've still got time to visit your firm.  In just two days, I'll make your current NT techies into 2000, XP, Active Directory and 2003 experts.  (And better yet they won't have to sit through any Redmondian propaganda.)  To join the large educational, pharmaceutical, agricultural, aerospace, utility, banking, government, telecommunication, law enforcement, publishing, transportation, and other organizations that I've assisted, either take a peek at the course outlines at, mail our assistant Jean Snead at, or call her at (757) 426-1431 (only between noon-5 Eastern time, weekdays, please).

Until Next Month...

Have a quiet and safe month. 

Please share this newsletter; I'd like very much to expand this periodical into a useful source of NT/2000/2003/XP information.  Please forward it to any associates who might find it helpful, and accept my thanks.  We are now at over 36,000 subscribers and I hope to use this to get information to every single Mastering 2003, XP, NT and 2000 Server reader. Thanks for letting me visit with you, and take care.  Many, many thanks to the readers who have mailed me to offer suggestions, errata, and those kind reviews.  As always, I'm at and please join us at the Forum with technical questions at


All contents copyright 2005 Mark Minasi. You are encouraged to quote this material, SO LONG as you include this entire document; thanks.