Diskeeper 11 vs Perfect Disk 8

Discussion in 'General Software Discussion' started by Chaos, Nov 6, 2006.

  1. PangingJr

    PangingJr Member

    Joined:
    Mar 14, 2003
    Messages:
    5,989
    Likes Received:
    56
    Trophy Points:
    0
i just thought you might have some of the third-party service names from the KB issues in the Diskeeper software; anyway, thanks for explaining that to this thread. i know you have helped answer this problem for other software products.
     
  2. Jeremy of Many

    Jeremy of Many New Member

    Joined:
    Aug 26, 2005
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    0
  3. PangingJr

    PangingJr Member

    Joined:
    Mar 14, 2003
    Messages:
    5,989
    Likes Received:
    56
    Trophy Points:
    0
    Jeremy of Many,
read your post there; as you said, "One of them has to be better."...in someone's mind.
you already seem to like Diskeeper more than PerfectDisk, no?
however, perhaps you could repost your questions here? then, if you ever get a response from either of the experts,
we can read both the questions and the answers together a bit more easily, and we all get the benefit of that.
     
  4. Jeremy of Many

    Jeremy of Many New Member

    Joined:
    Aug 26, 2005
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    0
    The simplicity of my understanding so far is:
PerfectDisk will take more time and put a heavier workload on the harddrive by using the SmartPlacement method: placing all files together in sequential order from the beginning of the disk spanning outwards. One pass usually leaves 0 file fragments remaining, so it is deemed very efficient. However, one reboot later and you have fragmented files again anyway! Here's a scenario: let's say svchost.exe was placed near the beginning of the disk for quickest access. The user then goes to Windows Update and that file is overwritten with a newer, patched one. Does PD move all files out of the way and squeeze that file in where the previously written one was, putting an extraordinary amount of work on the drive for one file, or does it just place it at the end of the line, defeating the purpose of the previous task of placing it near the beginning for quickest access? This is something I should actually ask here, since employees from DK and PD are replying. It would be wonderful to get clarification as to what happens in this scenario.
Diskeeper, while it doesn't do SmartPlacement, has I-FAAST 2, which you can learn a lot about by reading Michael's (Project Manager of Diskeeper) blog entry. It does invisible (InvisiTasking) and completely automated defragmentation in real time, which you can understand thoroughly by reading another blog entry of Michael's. I used PD in the past, when I truly believed it was superior in its efficiency. However, since DK 2007 was released, I cannot be bothered to manually defrag my files. I would rather have DK manage my files in the background at no expense to my overall system performance than have PD defrag them nicely during one session, let them fragment over 24 hours or two weeks (as users are frequently doing various tasks which result in varying levels of I/O (reads/writes)), and then defragment them again.
In the long run, I think Diskeeper is more beneficial to the lifespan/health of the harddrive, for reasons well explained in the second of Michael's blog entries I linked above, or again here.

Don't get me wrong, I'm not trying to be a fanboy here. If the experts at Raxco can prove in detail that PerfectDisk is better and healthier for a harddrive, and explain both the pros and cons of using PD8, then that would be the best thing for everyone here.
     
  5. gshayes

    gshayes New Member

    Joined:
    Nov 6, 2006
    Messages:
    4
    Likes Received:
    1
    Trophy Points:
    0
There are several known 3rd party programs that have drivers that will prevent PerfectDisk's boot time defrag from running. The Raxco support site at http://www.raxco.com/support has detailed information about the programs that we are aware of and whether the 3rd party developer has released a fix. The PerfectDisk Support Info (Help/About PerfectDisk and then click on the Support Info button) looks for the ones that we are aware of and tells you if they are found.

    - Greg/Raxco Software
    Microsoft MVP - Windows File Systems

    Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a commercial defrag utility, as a systems engineer in the support department.
     
  6. PangingJr

    PangingJr Member

    Joined:
    Mar 14, 2003
    Messages:
    5,989
    Likes Received:
    56
    Trophy Points:
    0
Sorry that i asked you to quote your own messages before; i didn't know it was you who posted that.

    and thanks for the information on the Support Info button, this is new to me too.

here are the names of the third party programs...
     
  7. gshayes

    gshayes New Member

    Joined:
    Nov 6, 2006
    Messages:
    4
    Likes Received:
    1
    Trophy Points:
    0
Quick question for you. What is the difference between scheduling a daily defrag pass - which will automatically defragment your drive every day - and having something run continuously every day? The answer? Not much - except that you have more control over defrag activity when you decide when/if it runs.

Probably the biggest difference between Raxco's defragmentation strategy and the way that other defragmenters work is that PerfectDisk is designed to do the best job possible in a single pass - at the end of its pass, the job is finished/done - you don't have to wonder if the drive is better or wait for who knows how many passes (or for how long, if it runs all of the time). PD is also designed for worst-case drives - where available free space is low and fragmentation complexity is high. For years, PD has been the only single-pass defragmenter out there. PD was designed from the beginning (based on our proven technology originally developed for performing disk optimization for OpenVMS systems) to improve drive performance in a single pass - meaning defragmentation of files and consolidation of free space - while being "friendly" with usage of system resources.

PerfectDisk was the 1st defragmenter to effectively and efficiently defragment multi-TB drives (which even the workstation version of PerfectDisk can easily do - nothing special needed). We still remain the only defragmenter that not only tells you about all of the NTFS metadata but also defragments all of these "special" files. We remain the only defragmenter that defragments the hibernate file. We actually provide information on free space fragmentation (most defragmenters don't, even though they talk about the importance of free space consolidation - even if they don't effectively do it).

Keep in mind that the ONLY reason you defragment a drive is to improve drive performance. The best drive performance possible can only be achieved by improving read performance (it's faster to read a defragmented file than a fragmented one) AND improving write performance by consolidating free space on the drive. PD focuses on BOTH of these aspects of drive performance - not just improving read performance (I don't know of too many Windows systems where most of the disk activity is simply read activity). Otherwise it's kind of like putting low-test gas in a high-performance race car. Sure, the car may run - but it doesn't "zoom" like you'd expect it to :)
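To make the seek-versus-transfer arithmetic behind this concrete, here is a minimal back-of-envelope model in Python (the seek and transfer numbers are assumptions for a generic spinning disk, not measurements of PerfectDisk or any particular drive):

    # Toy model: why a fragmented read costs more than a contiguous one.
    AVG_SEEK_MS = 9.0         # assumed seek + rotational latency per fragment
    TRANSFER_MB_PER_S = 50.0  # assumed sequential transfer rate

    def read_time_ms(file_mb, fragments):
        """Estimated time to read a file split into `fragments` pieces."""
        seek_cost = fragments * AVG_SEEK_MS
        transfer_cost = file_mb / TRANSFER_MB_PER_S * 1000.0
        return seek_cost + transfer_cost

    for frags in (1, 10, 100):
        print(f"100 MB file in {frags:3d} fragment(s): {read_time_ms(100, frags):6.0f} ms")

On these assumed numbers, a 100 MB file in 100 fragments takes roughly 45% longer to read than the same file stored contiguously; consolidated free space plays the same role on the write side by reducing how often new writes must be split.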

Raxco is a close Microsoft partner. Besides me being an MS MVP and having direct access to the file system group, our developers also have a very close relationship with the MS file system team - they are all PerfectDisk fans. Every version of PerfectDisk beginning with V4.0 has been certified by Microsoft, and Raxco was on the team that helped Microsoft write the Vista certification requirements. Microsoft itself talks about the importance of consolidating free space as part of defragmenting. How important? The free space consolidation portion of the Vista defragmenter was completely re-written to do a more effective job of it. A real good "read" is a test performed several years ago by one of the original developers of NTFS on the performance improvement that can be measured and seen from consolidating free space (vs. not consolidating it) - which can be read at http://www.raxco.com/products/perfectdisk2k/whitepapers/FreeSpaceConsolidation.pdf

As far as a file placement strategy goes, Smart Placement (patented) is primarily designed to accomplish 2 things: (1) slow down the rate of re-fragmentation (remember, the file system is designed to fragment - you can't prevent it from happening, but you can slow it down) and (2) speed up future defrag passes. Heavier workload using Smart Placement? Possibly - but not always. Remember, Smart Placement is doing several things - defragmenting files, intelligently placing them AND consolidating free space. Compared to defragmenters that are doing just defrag, or possibly defrag and some sort of file placement, Smart Placement may take longer, but ultimately the drive performs better in the long run.

Smart Placement is based on file modification date - NOT last access date. With Smart Placement, once an initial defrag pass has been performed, future passes require less effort (both to defragment files and to consolidate free space). There are some additional performance benefits that can be seen (improved boot speeds) due to where Smart Placement places boot files (at the "beginning" of the logical drive, adjacent to the boot sector). However, Smart Placement doesn't attempt to get further improvement in drive performance by specifically placing files at a particular place on the drive where it is presumed that the drive is fastest (presumed, but very difficult to accurately determine, as the drive typically isn't idle for a long enough period of time to detect accurately - works great in the lab but not so well in a production environment). Typically, the "cost" to determine these files and shuffle them back and forth on an active drive, where what is being accessed fluctuates over time as the drive is used, isn't worth the very minor performance improvement that might be gained. You get a far better improvement in drive performance from free space consolidation than from specific file placement.
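As an illustration of the ordering idea only (this is a sketch of sorting by modification date, not Raxco's patented algorithm), a few lines of Python show what "stable files first, volatile files last" looks like:

    # Conceptual sketch: order files so the least-recently-modified come
    # first (stable files toward the front of the layout, volatile last).
    import os

    def placement_order(root):
        entries = []
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    entries.append((os.path.getmtime(path), path))
                except OSError:
                    pass  # file vanished or inaccessible; skip it
        entries.sort()  # oldest modification time first
        return [path for _mtime, path in entries]

    for path in placement_order(r"C:\Program Files")[:10]:
        print(path)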

Regarding PD8 specifically, a couple of new things: single file defrag (allows you to quickly defragment a single file) as well as a Consolidate Free Space defrag (defragments files and consolidates free space - but doesn't Smart Place files). Our patent-pending Resource Saver technology also reduces the system resources required to defragment a file by eliminating some of the initial I/O usually required prior to actually defragmenting a file. As drives get larger and larger, being able to quickly and efficiently defragment huge drives is going to be very important. PD8 is designed for these types of drives.

    What's nice for defrag users is that people actually have a choice - there are several vendors - from home user "shareware" defragmenters to full featured enterprise enabled defragmenters like PerfectDisk and Diskeeper.

    Well, I don't know how many characters I'm allowed to post so I think I'd better quit now and see if it is accepted :)

    - Greg/Raxco Software
    Microsoft MVP - Windows File Systems

    Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a commercial defrag utility, as a systems engineer in the support department.
     
    PangingJr likes this.
  8. mike2h

    mike2h New Member

    Joined:
    Nov 11, 2002
    Messages:
    6,359
    Likes Received:
    69
    Trophy Points:
    0
just wanted to thank everybody involved for a very informative (& professional) thread.
have been using pd since version 6 on all 3 comps. very happy with it, but have been considering trying dk just because i like to check my options. always wondered what the main difference was; this thread has taken care of that.
     
  9. BWX

    BWX get out and ride

    Joined:
    Nov 29, 2002
    Messages:
    19,684
    Likes Received:
    63
    Trophy Points:
    73
Thanks, I found the ones that I have installed. I will try the workarounds.

    http://www.raxco.com/support/windows/kb_details.cfm?kbid=537&issue=41



     
  10. Jeremy of Many

    Jeremy of Many New Member

    Joined:
    Aug 26, 2005
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    0
    Is that a represented-as-a-whole opinion of PerfectDisk, or just yours?
A daily defrag pass, or automatic invisible defragmentation? Sure, a user can schedule a defrag at, for example, 5 PM because they intend to be AFK. Let's say their plan changes and they end up sitting down at the PC at 5:03 PM. They open up their web browser or video editing software, but then remember that PD8 is defragging their drive. It has already analyzed the drive, so what happens to a file they modify, delete, or create while the defrag pass is in session? Is it dealt with dynamically, does PD ignore it, what?
    My point is, they may be in control of when defrags take place, but I'm asking what happens to the files that are changed during a defrag pass?

I realize from experience that PerfectDisk is very efficient in one pass. I've thought this through for a good portion of the day, and I understand how PD's combination of SmartPlacement and Free Space Consolidation is designed to prevent future fragmentation. However, one drawback (con) I can think of, given the scenario I previously mentioned:
Let's say svchost.exe was placed near the beginning of the disk for quickest access. The user then goes to Windows Update and that file is overwritten with a newer, patched one. Does PD move all files out of the way and squeeze that file in where the previously written one was, putting an extraordinary amount of work on the drive for one file, or does it just place it at the end of the line, defeating the purpose of the previous task of placing it near the beginning for quickest access?
    Would you be so kind as to clear that up for us? :)

Before quoting the following, keep in mind that I did not edit your words in any way; I simply restructured them in list form to make them more easily readable:
    If Michael has anything to say about Diskeeper's lack of or similar ability to do some of that, that would be nice. :)

As I stated in the beginning, I realize the importance of this. However, both PerfectDisk and Diskeeper do this. I've read the PDFs, but I question their legitimacy, especially when they are done by Raxco and not a 3rd party. Also, the one on FSC is outdated and doesn't address newer versions of either program.
The Impact of Free Space Consolidation On Windows File System Performance PDF and Michael's post really seem to oppose each other. Raxco says Diskeeper results in wasted seeks (I/Os), and although Michael's post isn't directed at Raxco, it does provide a good understanding and emphasizes the importance of defragmenting a harddrive in order to extend its lifespan. So those two sources do, in a way, contradict one another.

Hey, I prefer efficiency over speed any day. Thus far, we've established that Free Space Consolidation is essential; therefore, without it, defragmenting alone doesn't fully deliver file performance. Michael, does DK2007 consolidate free space as well as automatically defragment?

    Roger that.
    No need to repeat yourself. :)
    Yes, I have experienced better boot-times with PD in the past.
Isn't that exactly what SmartPlacement does? If going by frequency of usage, and a file is recognised as not being used as often as it used to be, does PD not "shuffle" it (change its color in the Disk Analysis and move it elsewhere on the next defrag)? I've experienced this many times. You've really got me confused now; have my eyes deceived me? .....
    Isn't this difficult task done by Diskeeper's I-FAAST feature?

    Yes, I figured the term spoke for itself, but thanks for that brief dumbed down explanation. :)
If FSC is maximizing the amount of free space between the last file and the end of the disk, and SP already incorporates FSC, then why did Raxco make FSC Defrag a separate feature?
    I think DK beat you to the punch on that one.
Isn't it designed to take its time for that extra efficiency that gives it that "single pass" edge? :p

    Yes, and I'm trying to determine, hopefully with the help of honest people like Michael and yourself, which defragmentation software is the very best to use.

    One more thing, I believe that your original post contained a statement which read something along the lines of "remember, the file system is designed to fragment", which you edited out. Care to explain that one?
     
    Last edited: Nov 10, 2006
  11. PangingJr

    PangingJr Member

    Joined:
    Mar 14, 2003
    Messages:
    5,989
    Likes Received:
    56
    Trophy Points:
    0
    Jeremy of Many,

in my opinion and experience, i always choose the software programs i want to use based on my own experiences with the software.
i read just a few things that i need to know about the software products i'm about to try, because i don't put much faith in what anyone says about them; i just need to try them, and that is the only thing that helps me decide which one is best for me.

have you actually tried both defragmentation software products?
i suspect that you have, but have you?
i mean, can you determine by yourself which one is best for your needs?

don't get me wrong, discussion is good, but unfortunately it doesn't always help you decide what is best for you. it always needs your own experience.
     
    Last edited: Nov 10, 2006
  12. Jeremy of Many

    Jeremy of Many New Member

    Joined:
    Aug 26, 2005
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    0
Yes, I used PD for a year or more. I've been using DK for several months now. I go by my experiences as well, but I'm fed up with users on other forums posting one-liners like "<Insert name here> rules! Very fast! Previous defragger ruined my harddrive, never go with it!"
    Regarding defragmenters in particular, I like to get as much proper information as I can, especially since reps from both companies are involved in this thread.
    Quite frankly, it's not my needs I'm concerned about, it's my harddrives'.
     
  13. PangingJr

    PangingJr Member

    Joined:
    Mar 14, 2003
    Messages:
    5,989
    Likes Received:
    56
    Trophy Points:
    0
    okay.
    please discuss away.
     
  14. mmaterie

    mmaterie New Member

    Joined:
    Nov 7, 2006
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    0
    Hi Jeremy,

(I wrote this really long post to answer your first question before I saw your most recent really long one. I'll try to fit answers directly into what I had already written. Some of what is here may be redundant with your post.) If I missed anything, please let me know.

    BTW: I really appreciate your investigation on this!

    ---

    I should clarify that I’m the "product marketing" guy at Diskeeper. That means I head up the direction of the products, and what new technologies or features go into them. However, I do know a good bit about the file systems. Over the years, I’ve traveled to numerous Microsoft campuses in the US and Canada to train their Premium Support Services team on Diskeeper and file system performance. I actually started at Diskeeper Corp in a technical role, but moved into this position to manage the development of Diskeeper 9.0.

    That said, I will spare you and everyone else in this thread a sales pitch. I’m here to help you better understand Diskeeper – that’s all. I hope that comes across in my response.

As anyone who followed that link may have read in the Diskeeper blogs, I don't advocate or believe in head-to-head comparisons by a proprietary vendor. As I stated, there is obvious bias and a strong likelihood that incorrect information will be presented about the competitor's product, either out of a lack of technical expertise with the product or as intentional propaganda. I can't see how that data would help users make proper decisions.

IT Professionals who spend company money want more than promises; they want results. I think that is exactly what the readers in this and other online tech forums demand from their hardware/software vendors. On that note, Diskeeper publishes papers that let you reproduce, typically with independent tools, that the product does what it says. For example, the Benchmarking I-FAAST paper in our Knowledge Center details how you can verify it actually does improve file access. As I mentioned in the blog entry, I-FAAST is also designed to speed up new file writes. By design, you can get faster new file writes due to disk physics than with any other approach I have seen - that is, of course, just my experience of other methods. However, there is only so much influence a defragmenter can have on file system behavior, which is why we don't heavily promote that or base a technology on it.
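For readers unfamiliar with the "disk physics" point: the outer tracks of a zoned-bit-recorded drive hold more sectors per revolution, so sustained transfer is higher there. A quick sketch with assumed zone geometry (illustrative numbers, not any vendor's spec) shows the scale of the effect:

    # Zoned-bit recording: outer tracks pack more 512-byte sectors per
    # revolution, so sustained transfer is higher there. Numbers assumed.
    RPM = 7200
    REV_PER_S = RPM / 60.0
    ZONES = {"outer": 1000, "inner": 550}  # assumed sectors per track

    for zone, sectors in ZONES.items():
        mb_per_s = sectors * 512 * REV_PER_S / 1e6
        print(f"{zone} zone: ~{mb_per_s:.1f} MB/s sustained transfer")

On these assumptions the outer zone moves data nearly twice as fast as the inner one, which is why where new files land on the platter matters at all.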

As for Windows XP's boot optimization, that was a technology co-developed by Diskeeper and Microsoft. MS built the prefetch system and the filter that determines the files and their order, and Diskeeper wrote the code to sequence the files on the disk (for the built-in defragmenter).
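Curious readers can inspect that prefetch layout list themselves; on a default XP install it lives at C:\Windows\Prefetch\Layout.ini. A minimal sketch (the path and the UTF-16 encoding are assumptions about a default XP setup):

    # Print a few entries from XP's prefetch layout list. Path and encoding
    # assume a default XP install; adjust if your system differs.
    layout = r"C:\Windows\Prefetch\Layout.ini"

    with open(layout, encoding="utf-16") as f:
        entries = f.read().splitlines()

    print(f"{len(entries)} entries; first five:")
    for line in entries[:5]:
        print(" ", line)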

When I buy a product and a vendor tells me "we do this or that", I ask more than just "why", especially when I'm not an expert on the subject (like the nuances between plasma and LCD). I make them prove to me that doing so actually makes a quantifiable difference. The reason they give may sound cool but be irrelevant for the purposes of the product. I see a lot of people doing this already (i.e. digging for truth) on this forum, but you have likely all seen people on some thread just buy into something at face value: hook, line and sinker.

If you visit the Diskeeper website you won't see us talking about other products. I don't mean to sound conceited, but we simply don't need to - I cover more on that in one of my blogs. That also means we don't infer things about other products. Inference is assumptive and contentious (my personal opinion). What if a politician said in a campaign, "Someone in the government recently told me they thought that my incumbent opponent may possibly be fondling one of his employees inappropriately"? It either happened or it didn't. If it did, prove it. It's a marketing tactic (the use of wishy-washy wording) used to encourage FUD (fear, uncertainty and doubt) and "unsell" the competitor. Unfortunately, it works.

Do you need to "wonder" if Diskeeper did its job? The answer is simply no. Does Diskeeper 2007 Automatic Defragmentation afford less control? The answer is again no. Does I-FAAST "shuffle" files around back and forth? Nope. Does placing files sequentially on the disk based on frequency of usage (not a rote file attribute) improve performance? Abso-freakin-lutely! And we actually prove it. Can you scientifically and accurately gauge disk performance without having to guess? Yes. It wasn't easy, but our developers did it and we have the Intellectual Property (IP) protected. And, for the record, it works better in production environments than in a lab because it dynamically adjusts to changes. If you used resources inappropriately to sequence files, sure, it could easily result in negative returns; that's true. Good thing, then, that Diskeeper does it intelligently and now invisibly. Does Diskeeper consolidate free space? Yes, and we have IP protection (since 1999) on that as well. Has it improved over time? Yes (just like the rest of the product), and it is an integral part of Diskeeper 2007.

Please note that Diskeeper is actually gauging file usage frequency. Reading the last-modified-date or last-access-date file attributes does not provide sufficient knowledge about usage to justify saying that an ongoing action is based on "frequency of usage". The I-FAAST system is far more advanced. It is actually learning about your system and how much of your PC's time is spent reading existing files, modifying them, or creating new ones. All that info lets it speed up file access. I'm not saying another defragmenter needs to go to those lengths for its file strategy purposes, but for I-FAAST to deliver on its promise, all of that is vital.

Many of the above comments are key points of development effort in the new version. Without InvisiTasking you could not run in real time. We install Diskeeper in an automatic mode because we are 100% confident in the technology; we know it works. Why schedule when you don't need to? The new graphical control panel for Diskeeper 2007 in the dashboard offers greater flexibility than ever before. We built in the ability to turn it off at certain times for those who are very sensitive to the thought of real-time defrag (we understand it is a new concept for many, and it can take some getting used to).

There is a falsely propagated myth that Diskeeper does not do free space consolidation, or de-emphasizes it. That is simply untrue. What we don't do is over-hype its value. It is a popular topic, and I did cover it in good depth on the Diskeeper blog. I hate to keep referring readers off this site, but it is a whole separate and very lengthy topic.

That leads me to another thing any software/technology buyer should consider: white papers that originate from, or are funded by, a vendor should always be suspect. I don't mean to sound like your momma (my apologies to anyone reading this if I've come across that way), but you know that many people believe everything they read or see. Don't get me wrong, I'm not saying all vendor papers are bogus - just a heads-up to be careful. In most cases, third parties are very ethical and would never compromise their ethics, so most analysts can be trusted.

    (added note: I see that you just brought this up to Jeremy)

    Let’s take a hypothetical example. Hire a very intelligent guru/analyst or solicit one that works for your company. Have him detect that a certain operation of a competitor’s product, that if run in some non-standard/reverse manner, makes it return a bad result. Point to an obscure, non official, comment that supposedly justifies the operation of that competitive product in the manner undertaken in a test. And then present the results, purporting that the test is a true apples-to-apples comparison.

An example might be a comparison between backup software vendors. What if a head-to-head test involved a speed comparison? If, for one product, all the bells and whistles were turned on, and on the other product nothing but the basic "back up the data" was initiated, it wouldn't be a fair test, would it?

    Unfortunately that kind of testing happens more often than it should.
    I’ll end off with the statement that Best is subjective.

If our advertising team ever says "best" or "leading", shame on them, unless they quantify it with facts. Yes, we may have used "Number One", but that's substantiated with sales figures (90%+ market share, as measured by National Purchase Diary - the independent group that tracks sales in the software "channel"). That slogan was used for those unfamiliar with Diskeeper's market share.

While sales are a pretty good indicator, they don't mean a product is the best. I think we can all agree on that. That's why I feel it is our [Diskeeper Corp] responsibility to continually evolve the product and improve it technically. Symantec (Norton Speed Disk) was the big kid on the block in Windows defrag in the 90s. Diskeeper had a better product and supplanted them, as shown by sales. Granted, Symantec, to their credit, focused on security, as they could make a lot more money in that market, and competing with our defragmenter wasn't worth their investment.

    Perhaps the best thing about most commercial defragmenters on the market is they offer a 30-day trial period so you can test drive the software first. Diskeeper offers an unconditional 30-day money back guarantee as well. Other vendors may do the same.

    Best regards,
    Michael Materie

PS: I also write all our technical papers, such as on virtualization, system reliability, operating system architecture (how it relates to our products), file systems, etc., so if you ever have a question, please stop by the diskeeperblog and ask me. It is a 99% "marketing-free" zone.
     
    Last edited: Nov 11, 2006
  15. Jeremy of Many

    Jeremy of Many New Member

    Joined:
    Aug 26, 2005
    Messages:
    42
    Likes Received:
    1
    Trophy Points:
    0
    My pleasure. :)

As I understand it, it monitors the routine of file system activity and adapts to any changes made as well. Does it work in conjunction with Free Space Consolidation?

    The next two quotes are key in my investigation.

    (From Michael's blog post "Comparing I-FAAST")
So frequency of use and modification date are not exactly the same, which indicates the difference between DK's and PD's defrag methods. Is either one better than the other?

More of what I was looking for. Would you say that anyone trying to get to the bottom of "Which defragmenter should I use?" could and should rely primarily on the blog posts at both www.diskeeperblog.com and http://perfectdiskblog.typepad.com/perfectdisk_blog ?

    So you're saying I-FAAST is better than SmartPlacement?

I totally agree with you on scheduling vs. automation regarding PD and DK. However, I want to acknowledge that this is my preference, and in no way whatsoever is it influencing my determination to gather as much proper info as possible regarding either PD or DK in this thread. If I decide in the end, based on the info that Michael and Greg give me, that PerfectDisk is better for my harddrive, then I will return to the scheduling functions of PD.
    How considerate of you. ;)

    That's fine. It's right here
    The statement that stands out for me the most is:
So if I have 5 files at 5 fragments apiece (as if I were to defrag, reboot, and have those user profile .LOGs and .DATs changed again), I don't really have much of a problem yet...
    However, for the majority of users out there, a problem does exist.

Yes, I have several friends who have separate partitions for C:\Documents and Settings, C:\Program Files and C:\WINDOWS. I, myself, haven't done this, nor do I have the intention of doing so anytime soon. Maybe later. It makes sense that doing so would reduce fragmentation (by separating dynamic data from static data). I do, however, have two harddrives. One contains Windows and all installed programs, and the other contains all stored data (archives, games, movies, music, software, various documents).

In most cases, yes. However, regarding defragmentation software, one has to be healthier for the drive than the others. Unless your answer to the next question is "Yes".

Are you saying that neither DK nor PD is better than the other when it comes to defragmentation as a whole?
     
  16. mmaterie

    mmaterie New Member

    Joined:
    Nov 7, 2006
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    0
    Hi Jeremy,

    I feel it's not my place (given I represent a vendor) to say one is better than the other or dish out sales pitches in a technical forum. That is for you and other readers to discuss. I'm here to help explain how our products work and why they do what they do, and I think Greg is doing the same. My purpose is not to attack Greg or Raxco's product. Raxco makes good products and Greg is a smart guy who has repeatedly earned the MS MVP award, which proves his desire to help by sharing his knowledge.

    I'll piece together answers to your latest questions:

-As for I-FAAST being better than SmartPlacement, that is for you to decide. I think the question is really what is better for you, which depends on what is important to you. They are different technologies with different purposes. How the technologies gather the data is relevant only insofar as how they need to apply the data they get. If Diskeeper's I-FAAST were to rely on Last Modified Date, it would limit its effectiveness dramatically. The only suggestion I have is that when you are evaluating what is better, you pick the best product based on real results, not based on which vendor has the most attractive sales pitch.

    -I-FAAST does implement free space consolidation as part of its overall performance strategy.

-Keep in mind that when you use a third party file system performance utility to sequence and/or place files (especially I-FAAST), you'll get better performance than partitioning by data type. I still recommend certain partitioning strategies based on disk spindles - such as placing the paging file on a separate spindle from the boot partition (i.e. \Windows).

-If you prefer scheduling over automation, that is your prerogative. As I noted before, the ability to turn it off makes it relatively similar to scheduling, in that you can decide to run it only when you allow it. I just spoke with a major bank today that is rolling out DK 2007 to all their workstations. They are turning off DK during evening batch jobs, not because it's not invisible, but simply because it's company policy to never run anything while the batch jobs run. It's not my business to tell them that's wrong. What's right for them is what's right for them.

-The blog sites are the vendors' way of providing data in a new format. I'm a big fan of Microsoft's blogs - there is an unbelievable wealth of info on them. Much better, in my opinion, than on their website. As a vendor of any product, there are company rules and procedures for promoting and selling, and narrow channels for getting that information out to customers. In most cases they are conservative and relatively terse. Company blogs give individuals the ability to expound on whatever they want to talk about in a less formal manner and without a great deal of company effort (official company docs are often heavily scrutinized and go through a long approval process before publishing). I agree with your comment that most company blogs are good places to get info. I try to make the Diskeeper blog a purely technical site. We may slip in the occasional PR announcement, but you can always skip over those if you don't care to read them. I have seen blogs that only serve as a marketing extension and don't really provide value.

-On your question about the number of fragments: it's all relative. Humans, per studies from research fields like HCI (Human-Computer Interaction), are less sensitive to slowdowns than applications are - at least from a performance standpoint. We humans can usually live with a few seconds' delay before we complain. That's not to say a computer can't, but computers move inexorably faster, so holding up one application a few seconds is far more pronounced from a performance standpoint. In other words, applications tend to be more sensitive. Your applications will definitely notice 2 seconds (video editing, for example). Imagine if you bought a DVD on Amazon and the purchase transaction was delayed by 2 seconds - you'd live with it. However, if you're an Amazon exec and every transaction was slowed by two seconds, it means a whole helluva lot less money for you. The point is that 5 files in 5 fragments each is not going to make your PC noticeably slower - at least not noticeably to you. Yes, it will get worse, and in almost all cases become noticeable to the user. Fragmentation will start to affect your computer's ability to produce well before the human interactive experience is impacted.

    As the product manager I think Diskeeper is close to perfect, but I also know there are things we can improve or newly invent (we've already started on the next version, and I have another whole version nearly planned after that one). What I always want to hear is what else we can do to make it better (wishlist@diskeeper.com). I think that holds true for almost every manufacturer/service provider in any industry. Companies that don't listen to their customers are doomed to fail. Raxco products have improved over time as well.

    -Now, can you test what product uses the least overhead to keep the system tip top - yes. Can you determine which product incurs the least (or no) effective overhead - yes. Can you test which product speeds up performance the most - yes. Will one product generally do better than another - yes.

I understand that most consumers don't have the time, or the ability, or even the desire to go through all that testing, but I don't really think they need to. I just don't see grandma running PCMark or PerfMon on two live, identically imaged volumes to test I/O overhead and file read time benchmark scores! System Administrators, on the other hand, do this kind of thing.
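For the curious, the simplest version of what those admins do looks like this: sampling a standard PerfMon disk counter with typeperf, the command-line front end that ships with Windows (the counter name is standard; the interval and sample count here are arbitrary choices):

    # Sample a PerfMon disk counter via typeperf (ships with Windows).
    import subprocess

    subprocess.run([
        "typeperf",
        r"\PhysicalDisk(_Total)\Avg. Disk sec/Read",
        "-si", "5",    # sample every 5 seconds
        "-sc", "12",   # 12 samples = one minute
    ], check=True)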

    Diskeeper is a great solution and I believe its results and reports go a long way to proving that.

    -Michael Materie
     
  17. Chaos

    Chaos Number Nine

    Joined:
    May 9, 2002
    Messages:
    5,260
    Likes Received:
    95
    Trophy Points:
    0
once again I want to thank the Raxco and Diskeeper staff for making this thread as informative and interesting as possible without attacking each other's product line.
     
  18. rms13

    rms13 New Member

    Joined:
    Nov 11, 2006
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    0
    To Jeremy:

I think we are asking the wrong question here. It should be not which one is better, but first, do we need to pay $40 or $100 in the first place?
$100 is half of what Windows XP Home costs, so it is better to think twice before purchase. Of course, I am talking about the home user - on a server, the need for automatic defrag is usually undeniable.

No doubt, the commercial people of both companies are working hard, but they are not telling the whole truth; it is more like creating an illusion that we can knowingly make our choice. But how can you compare, and has anybody actually done that?
This business is somewhat like that of oil additives. People want to believe that some product will do magic, and would rather pay money for such a 'miracle product' than change the oil as appropriate.

Have a look at the examples used to prove how much defragmenters will increase performance. I have no idea where they obtained such disks, but they are not from a real computer. I have serviced computers for years but haven't seen anything similar. Even on an NT computer that had been working 24/7 for at least 6 years without any defragmentation, the disk was not that bad.
But you should compare to a normal disk - not over-filled, defragmented at least weekly with the Windows defragmenter. How big will the effect be then - really $100 worth?

    Here are some results.
The test computer is a Toshiba P4 laptop with 1 GB memory and a 60 GB 7K60 drive. It is partitioned into a 34 GB (C drive) and a 22 GB partition; the C partition is 40% free, the data storage partition 5% free.

First, the latest trial version of Diskeeper 2007 Pro Premier was installed, and after 1 week of operation (at least 10 hours/day, I-FAAST enabled), measurements were taken: Windows startup till the login screen; Lotus Notes till the login screen and till messages are displayed; Photoshop startup; AutoCAD startup. Lotus Notes was used daily; Photoshop and AutoCAD just for the test.
Then Diskeeper was uninstalled and replaced by the latest trial version of PerfectDisk 8, a Smart Placement defragmentation was run (with startup optimization set to 'let PD manage'), and the same measurements were performed.
Then PD was uninstalled, the drive was defragmented with Windows defrag (it significantly restructures the disk - layout.ini file placement is completely different in the PD and Windows approaches), and the same measurements were taken again. All given times are averages of 3 measurements, in seconds.

    Diskeeper Startup - 33.0; Lotus - 6.4 + 7.7; Photoshop - 18.3; ACAD - 27.3
    PerfectDisk Startup - 33.6; Lotus - 7.2 + 8.4; Photoshop - 17.9; ACAD - 28.6
    Windows defrag Startup - 31.2; Lotus - 6.6 + 7.3; Photoshop - 18.2; ACAD - 26.8
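For anyone who wants to reproduce this kind of timing, here is a minimal read-benchmark sketch (the test file path is hypothetical, and the script cannot defeat the Windows file cache by itself - rebooting between runs is the crude fix):

    # Time repeated sequential reads of a large test file, report the spread.
    import statistics, time

    TEST_FILE = r"D:\bench\big_sample.bin"  # hypothetical test file
    RUNS = 3

    times = []
    for _ in range(RUNS):
        start = time.perf_counter()
        with open(TEST_FILE, "rb") as f:
            while f.read(1 << 20):  # read in 1 MB chunks until EOF
                pass
        times.append(time.perf_counter() - start)

    print("runs:", ", ".join(f"{t:.2f}s" for t in times))
    print(f"mean {statistics.mean(times):.2f}s, stdev {statistics.stdev(times):.2f}s")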

Now, have I got it wrong, did I measure incorrectly, or did I not measure the right thing? Where are those $100-worth improvements?

And then, if we want to see what really slows down the computer, do a simple experiment. Create something like 50,000 small files. It can be done with a script like this (in a command window): FOR /L %f in (1,1,50000) do echo abc > %f . It will create 50,000 files with the text abc in the current directory. Then select 10,000 of the files, right-click and enjoy :(. Note that all those files are stored entirely inside the MFT, so no defrag can do anything about them. And, by the way, you will have discovered a free substitute for Frag Shield :D. You can delete all those files (it will take time), but the MFT size will stay increased (~1 KB per file), as it cannot shrink.
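For those who prefer to script it, a rough Python equivalent of the same experiment, with timing and cleanup added (the scratch path is arbitrary; the MFT growth itself is permanent either way):

    # Recreate the tiny-file experiment: per-file metadata overhead, not
    # data volume, is what hurts. SCRATCH is a throwaway directory.
    import os, shutil, time

    SCRATCH = r"C:\temp\tinyfiles"
    COUNT = 50000

    os.makedirs(SCRATCH, exist_ok=True)
    start = time.perf_counter()
    for i in range(COUNT):
        with open(os.path.join(SCRATCH, str(i)), "w") as f:
            f.write("abc\n")
    print(f"created {COUNT} files in {time.perf_counter() - start:.1f}s")

    start = time.perf_counter()
    count = len(os.listdir(SCRATCH))
    print(f"listed {count} entries in {time.perf_counter() - start:.1f}s")

    shutil.rmtree(SCRATCH)  # frees space, but the grown MFT stays grown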


To Greg/Raxco Software:

As you see, your boot optimization does not work. It makes a lot of sense when described on paper, but in reality Windows defrag places those files very differently and wins (I have measured it on other computers, and the results were even more impressive in favour of Win defrag).
I have raised this question with your support, but the best answer was that I am free not to use the feature! Obviously, they assumed that I had already paid for the software. Maybe you have a better comment?

Smart Placement - I have tested it with some DLLs in the system32 folder which are not listed in the layout.ini file. Simply changing the last modified date makes the next defrag pass move the file to the end of the drive. You may call it smart, but it does not seem that way to me. Just wondering about the patent system. Maybe that is the reason you haven't included a feature to locate a file on the disk. Your tech support redirected me to some article which says it is not done for the sake of resource economy. I am running DiskView and wondering whether it really uses that many resources.

    To Michael/Diskeeper :

As you see, I haven't noticed any measurable performance increase from your $100 program. Could you please explain where I am wrong?

What I have noticed, though, is that when run in manual mode on the data disk (5% free), it performs like a simple multi-pass defrag: every time it is run, it tries to move some data around, works for some time, but gives up without finishing the job. It looks no better than the Windows built-in defrag.

The Manual Analysis Job Report displays information that is simply not true - something like 'Warning! The MFT usage is currently 90 percent of total MFT size, which indicates it is likely the MFT will become fragmented.'
Why would it, if there is at least 500% of reserved space? Is this statement intended to scare somebody who does not understand how the MFT works, and to help sell the air-bubble Frag Shield?

Finally, and most importantly: the idea behind I-FAAST is really great for a server.
But what about a home computer, or even a workstation? What makes you think that what I do most frequently is the most important thing I have the computer for? It could be a computer on the Shuttle, where I could maybe play games for 2 weeks - but I still need it for landing.

If you really wanted to make an 'intelligent' defrag for a home workstation (and this applies to PerfectDisk too), the best thing would be to speed up the system, not application or data file access times. But that does not require any real-time learning - that is, if you know how Windows works.
Or, if you really wanted to justify $100, it could be I-FAAST with a 'learning mode', where information would be gathered about the most important activities and then applied to manual defrag passes - without anything automatic running in the background (at least as an option). You are not going to tell people who have disabled most of the Windows services and gained a measurable improvement that you can monitor the file system without using any resources at all?

Of course, the current automatic mode could be left for those who want to believe in magic - would that be most of your customers? :p

    Thanks.
     
    Last edited: Nov 11, 2006
  19. mmaterie

    mmaterie New Member

    Joined:
    Nov 7, 2006
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    0
    Part 1:

I agree that everyone needs to evaluate these products. Every general purpose file system in a commercially available OS generates file fragments over time - that is a fact. I'm assuming the people reading this thread are familiar with that, but if anyone wants proof, google it and you'll find plenty of reputable sources.

    I don't think any major defragmentation vendor is pushing anyone to buy out of fear. If some communication comes across that way, let the vendor know - marketing people make mistakes. I can't speak for all vendors of this type of software, so some may not be ethical. One unethical vendor does not mean that they all are.

    The primary difference between a server and a workstation is that a server hosts more data, users, etc. Every defrag vendor I've seen discuss fragmentation states quite openly that fragmentation is proportional to the amount of file modifications, writes, and deletes. If the computer does nothing but read files on the disk all day long, apart from background system activity, you're not going to see much fragmentation - and it's highly unlikely to become a performance issue.

    If you use your PC for internet browsing and web based email, you can probably go a very long while before a defragmentation job makes a performance difference for you.

Due to my industry affiliation, I receive a great deal of free commercial software. I have a half dozen antivirus and antispyware programs sitting in unopened boxes. I have free licenses to install them on my home PCs, but I do not. It's not that they aren't good products - they are. I don't install them because I don't need them. I know to avoid certain places on the internet and not to open attachments. I have a background in security (I used to work in a technical role in that industry). For another person, that may not be the case, and anti-malware may be a crucial part of keeping their PCs operational and performing well.

    If you try a piece of software and you can't perceive any benefit from the software, DON'T BUY IT.

It's unfortunate that you feel information is being withheld here. I already stated earlier in the thread that I understand home users are not going to have the time or wherewithal to undergo scientific analysis. That's why the media (who speak freely) and the idea of freedom of speech are so vital to protecting consumers/citizens. In the PC world, there are numerous media experts and IT analysts out there to help buyers/companies make the right decisions.

Exactly how is this business akin to that of selling "oil additives"? Is that entirely due to your evaluations? So that I'm not misunderstanding you, what exactly are you insinuating? Is everyone else who recommends defragmentation a snake-oil dealer, or friends with one? Are Mark Russinovich, Paul Thurrott, Jim Allchin, Walter Mossberg, the NTFS file system developers, and hundreds more all on some defrag vendor's payroll, or just hawking their own mystery oil? Is any vendor telling people not to "change their oil", per your analogy?

I understand you are segmenting the consumer from the business machine/server, but that isn't the dividing line. It simply comes down to how much a computer has to do with respect to disk I/O dynamism, company computer or home PC alike.

I understand your speculation and understand that personal experience drives your beliefs. I can't argue with that, because I'm the same way.

Keep in mind that major defrag manufacturers make 90% of their income selling to businesses. Diskeeper marketing (with the exception of technically simplified emails sent to home users who download trialware) is 100% geared to IT professionals.

Apparently I need to reiterate that I'm not in this forum to sell 2 or 3 licenses to "gullible victims". And I'm not here pushing people or scaring them into buying. I'm here to help answer questions that users have about Diskeeper. That's it!

I feel like singing a Mel Brooks tune. Maybe something from History of the World Part 1. Something with dancing nuns should do...
     
    Last edited: Nov 12, 2006
  20. mmaterie

    mmaterie New Member

    Joined:
    Nov 7, 2006
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    0
    Part 2:

    I think you may have misunderstood some of the purposes of the products and how they work. A defragmented file is a defragmented file, no matter what product you use to get it there. The differences are whether one product can get there, and what resources/impact it takes to do so. In a corporate market, other factors like automaticity and management capabilities also weigh in more heavily.

    For a scientific test you need to always begin with an identical starting platform, and you can't sequentially run one product after another. I'm assuming the tests were timed programmatically (i.e. not by stop watch)?

Are 3 test runs sufficient to come to a conclusion? If you took statistics and probability courses, you may recall the margin of error on only 3 tests: http://en.wikipedia.org/wiki/Margin_of_error.
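To put a number on it: with only 3 runs, a 95% confidence interval on the mean uses Student's t with 2 degrees of freedom (t = 4.303), so even small run-to-run scatter yields an interval comparable to the differences in the table above. With hypothetical figures:

    # 95% confidence interval on a mean of 3 timing runs (samples invented).
    import statistics

    samples = [33.0, 33.4, 32.6]              # seconds, hypothetical
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / len(samples) ** 0.5
    half_width = 4.303 * sem                  # t(0.975, df=2)

    print(f"mean {mean:.2f}s, 95% CI +/- {half_width:.2f}s")  # ~ +/- 0.99s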

    Was caching addressed to ensure the reads were direct from disk?

    I also don't believe Raxco makes claims that file access is sped up beyond defrag with their features (they claim to slow re-fragmentation), but I'll leave the detailed explanations to them.

Keep in mind I-FAAST is an ongoing feature - not a once-and-done manual job. As your disk fills and the layout files scatter (it will happen - just look inside the file to see what is in there), Windows will end up, over time, pushing the file further back on your volume as it seeks to find a single large chunk of free space to place the entire group into. That is due to NTFS file allocation algorithms, which will fragment the free space as Windows writes newly modified files near the originals. As that happens, watch your boot-up get slower and slower. Maybe the layout was ideal that day? Also, what did I-FAAST inform you it would provide performance-wise? Can you confirm that the files/processes you tested are now considered by I-FAAST to be frequently used, and that I-FAAST has actually processed them?

    If you run I-FAAST prior to the built-in, I-FAAST has already placed the boot files. Why would the built-in move them?

    If you don't reboot, log out, disable all services, and never use the disk for anything (store new files on it, etc...), then those numbers will stay the same.

    Yes, creating files causes significant I/O overhead. Is that your point?

    That script, which has been available at numerous system tweak sites for years, is not unlike what Frag Shield does.

    Sounds like a rhetorical question to me, but I'll bite anyways.

In addition to all the questions on the testing that I've already asked: was anything related to Photoshop, AutoCAD or Lotus fragmented initially? If not, defragmentation won't improve performance. Where were they located to begin with? What tools did you use to measure (BootVis)?

    If you are an IT Professional (or scientist, or mathematician), you know that results are only valid by following a scientific method and minimizing/eliminating the error margin. Following and documenting the method itself is just as important as the results.

    If you review the Benchmarking I-FAAST paper, you'll see several pages of test methodology. When I write a paper I ensure anyone else can repro the results. If someone else can't repro test results, you do not have a case - you have an anomaly.

    I'm not invalidating your testing, I'm just pointing out that your methodology, in detail, is absent from your results.

I agree the warning is dramatic; I can get this corrected in a future release.

I'm not sure where you get 500% from. Your frustration with Frag Shield is duly noted, but I'm not sure why; according to your logic, the MFT would never fragment? I don't think I need to go into detail on that aspect, as it was covered earlier in this thread, do I?

For the record, Diskeeper is not designed as a manual defragmenter. Very few improvements are made in that area. As I noted, Diskeeper is an enterprise product focused on running automatically and silently. Multi-pass is a strategy for minimizing resource usage (specifically I/O overhead) - it was never built for manual defragmentation to begin with. If you want a manual defragmenter, I'd suggest you look at other products.

I-FAAST speeds up file access above simply defragmenting. It's also continually learning and updating its information, so as the usage changes, so will the file sequencing. And if a file is ideally situated already, it keeps it from moving elsewhere, where access speed would be reduced. In v2007, we also added the ability to hand-select files, should situations such as you describe be the case.

    You're entitled to your opinion. Given that the disk is typically the slowest component in a modern PC, optimizing that component returns results. I agree that there are many drivers/services that many people won't need and the Windows Service Host is getting more and more bloated, but we need to remember that Windows is a general purpose OS. It is kind of a one-size-fits-all system.

    There are many forums and guides that anyone can follow that educate on how to tweak the system, as well as products on the market that automate it.

    I'm not a world-class expert on Windows and file systems, though I am very proficient in their design, architecture and operation, and have written several technical papers that discuss these subjects already. I study up on it, and personally explore the system with various tools, so I can be better at my job working with our developers - who are world class experts on file systems and operating system internals.

I'm not sure why you are goading me here. InvisiTasking relies on default Windows system monitors to operate. I've noted this on other forums as well.

These aren't the droids you're looking for...

As I noted, most of our customers are IT Professionals. Are they fooled by sleight of hand? I seriously doubt it; disk defrag is a legitimate discipline. I know they don't buy RAM defragmenters, if that is where you are headed. IT Professionals test the product for weeks/months and benchmark performance on cross-sections of desktops and laptops. They use various tools like PerfMon (looking at I/O counters) and benchmarking software to justify a purchase.

I also don't think a great many users active on technical forums are as gullible as you suggest either. Are there people, as Jeremy pointed out, who simply conclude "it rocks!" because of a pretty UI? Of course.

Anyways... if anyone wants to ask me about how Diskeeper works, I'm always available to answer questions. You can contact me at the company blog anytime. I'm short on time for the next few weeks, so if I do return here to answer some questions, please understand I can't keep up these novels. Thanks for bringing up some good questions and considerations.

    -Michael Materie
     
    Last edited: Nov 12, 2006
