Faster iteration over thousands of files

I'm trying to do something on ~200,000 files in a single folder.

When I do this:

for i in *; do /bin/echo -n "."; done

One dot is printed every few seconds. The same operation on a folder with a hundred files works blazingly fast.

Why is this so? How can I speed this up for folders with hundreds of thousands of files?

Answers


Try this with GNU find:

find . -maxdepth 1 -type f -printf "."

The loop in the question is slow because /bin/echo is an external program, so the shell forks and execs a new process for every one of the ~200,000 files. find prints the dot itself via -printf, so it runs as a single process and avoids all of that per-file overhead.
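If you want to keep the loop, another option (a sketch, not from the original answer) is to use the shell's *builtin* printf instead of /bin/echo, which also avoids forking a process per file:

```shell
# printf is a shell builtin in bash and most POSIX shells,
# so no new process is started for each file:
for i in *; do printf "."; done
```

Note that the glob expansion of 200,000 names still takes some memory and time, but the dominant cost (one fork+exec per file) disappears.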
