6

This snippet of code is just supposed to write a string into a text file called "all_results.txt". I had errors with File.WriteAllText, and after searching the net for solutions I tried FileStream and StreamWriter as substitutes, but the problem still persists.

It gave me:

IOException Unhandled: The process cannot access the file 'C:\Users\MadDebater\Desktop\ConsoleTest1\ConsoleTest\bin\Debug\all_results.txt' because it is being used by another process.

Strangely, the error occurs arbitrarily. It could be during the 3rd loop, or the 45th, before it hits an error. I've provided the full code for the class in case the problem is deeper than it seems. I'm sure it has nothing to do with my virus scanner or anything like that.

try
{
    using (FileStream stream = new FileStream(@"all_results.txt", FileMode.Create)) // Exception here
    {
        using (StreamWriter writer = new StreamWriter(stream))
        {
            writer.WriteLine(result);
            writer.Dispose();
            writer.Close();
        }

        stream.Dispose();
        stream.Close();
    }
}
catch (IOException ex)
{
    Console.WriteLine(ex);
}

Even when I try this, it still fails.

try
{
    File.WriteAllText(@"all_results.txt", result); // Exception here
}
catch (IOException ex)
{
    Console.WriteLine(ex.Message);
}

Below is the full code for the class. It is meant to take in a list of Twitter tweets and classify them one by one using Bayes classification.

    using System;
    using System.IO;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using BayesClassifier;
    using System.Text.RegularExpressions;

    namespace ConsoleTest
    {
        class Analyzer
        {
        public static void Analyzing(List<string> all_results)
        {
            Reducting(all_results);
            Classifying();
        }

        public static void Reducting(List<string> all_results)
        {
            //Reductor
            //Precondition: List<string> results
            all_results.ForEach(delegate(String text)
            {

                const string ScreenNamePattern = @"@([A-Za-z0-9\-_&;]+)";
                const string HashTagPattern = @"#([A-Za-z0-9\-_&;]+)";
                const string HyperLinkPattern = @"(http://\S+)\s?";
                string result = text;

                if (result.Contains("http://"))
                {
                    var links = new List<string>();
                    foreach (Match match in Regex.Matches(result, HyperLinkPattern))
                    {
                        var url = match.Groups[1].Value;
                        if (!links.Contains(url))
                        {
                            links.Add(url);
                            result = result.Replace(url, String.Format(""));
                        }
                    }
                }

                if (result.Contains("@"))
                {
                    var names = new List<string>();
                    foreach (Match match in Regex.Matches(result, ScreenNamePattern))
                    {
                        var screenName = match.Groups[1].Value;
                        if (!names.Contains(screenName))
                        {
                            names.Add(screenName);
                            result = result.Replace("@" + screenName,
                               String.Format(""));
                        }
                    }
                }

                if (result.Contains("#"))
                {
                    var names = new List<string>();
                    foreach (Match match in Regex.Matches(result, HashTagPattern))
                    {
                        var hashTag = match.Groups[1].Value;
                        if (!names.Contains(hashTag))
                        {
                            names.Add(hashTag);
                            result = result.Replace("#" + hashTag,
                               String.Format(""));
                        }
                    }
                }

                // Write into text file
/*
                try
                {
                    using (FileStream stream = new FileStream(@"all_results.txt", FileMode.Create)) // Exception here
                {
                    using (StreamWriter writer = new StreamWriter(stream))
                    {
                        writer.WriteLine(result);
                        writer.Dispose();
                        writer.Close();
                    }

                    stream.Dispose();
                    stream.Close();
                }

                }
                catch (IOException ex)
                {
                    Console.WriteLine(ex);
                }
                */
                try
                {
                    File.WriteAllText(@"all_results.txt", result); // Exception here
                }
                catch (IOException ex)
                {
                    Console.WriteLine(ex.Message);
                }

            });
        }

        public static void Classifying()
        {
            // Classifying

            BayesClassifier.Classifier m_Classifier = new BayesClassifier.Classifier();


            m_Classifier.TeachCategory("Positive", new System.IO.StreamReader("POSfile.txt"));
            m_Classifier.TeachCategory("Negative", new System.IO.StreamReader("NEGfile.txt"));

            Dictionary<string, double> newscore;
            newscore = m_Classifier.Classify(new System.IO.StreamReader("all_results.txt"));

            PrintResults(newscore);
        }

        public static void PrintResults(Dictionary<string, double> newscore)
        {
            foreach (KeyValuePair<string, double> p in newscore)
            {
                Console.WriteLine(p.Key + ", " + p.Value);
            }

            List<string> list = new List<string>();
            using (StreamReader reader = new StreamReader("all_results.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    list.Add(line);          // Add to list.
                    Console.WriteLine(line); // Write to console.

                }

                reader.Close();
            }

            //PrintSentiment(newscore);
        }

        public static void PrintSentiment(Dictionary<string, double> newscore)
        {

            // if difference < 2, neutral
            // if neg < pos, pos
            // if pos < neg, neg

            double pos = newscore["Positive"];
            double neg = newscore["Negative"];
            string sentiment = "";

            if (Math.Abs(pos - neg) < 1.03)
            {
                sentiment = "NEUTRAL";
            }
            else
            {
                if (neg < pos)
                {
                    sentiment = "POSITIVE";
                }
                else if (pos < neg)
                {
                    sentiment = "NEGATIVE";
                }
            }

            Console.WriteLine(sentiment);


            // append tweet_collection to final_results <string> list
            // append sentiment tag to the final_results <string> list
            // recursive
        }
    }
}
  • Any reason you're not using a TextWriter? Commented Sep 27, 2010 at 12:54
  • Also, I have found errors like this in the past when writing to a text file in a tight loop. The file does not get released in time for the next iteration to use it, and this could be the problem, especially if there is a lot of text to write. Commented Sep 27, 2010 at 12:56
  • @w69rdy Not familiar with TextWriter. Is there a workaround for the tight loop problem? Commented Sep 27, 2010 at 13:53
  • The best way to handle it is to keep the file open until you have finished writing to it. This should avoid the problem you are having and will also prevent any other processes from opening it whilst you are trying to use it. Commented Sep 27, 2010 at 14:02
  • @w69rdy Can you show me how this can be done through the code please? It's much clearer that way. I'm not experienced with files. Commented Sep 27, 2010 at 14:30

10 Answers

3

Don't call Dispose() and Close() on the FileStream and StreamWriter; this is handled automatically by the using clause.
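A cleaned-up sketch of the question's snippet with those redundant calls removed; ResultWriter is an illustrative name, not part of the original code:

```csharp
using System;
using System.IO;

static class ResultWriter
{
    // The using statements dispose (and therefore close and flush)
    // both the writer and the underlying stream automatically.
    public static void Write(string path, string result)
    {
        try
        {
            using (FileStream stream = new FileStream(path, FileMode.Create))
            using (StreamWriter writer = new StreamWriter(stream))
            {
                writer.WriteLine(result);
            } // writer and stream are flushed, closed, and disposed here
        }
        catch (IOException ex)
        {
            Console.WriteLine(ex.Message);
        }
    }
}
```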


3 Comments

Even if I just use the File.WriteAllText() code that does not use the Dispose() and Close() methods, the problem persists.
StreamReader is also closed manually in PrintResults method. Try without that call.
Is there more than one thread? It seems like you are overwriting your own file, and sometimes it's locked by yourself in another thread. Why are you using a ForEach with a delegate instead of a normal foreach (string text in all_results)?
2

Use a utility like Filemon to check which processes are using the file.

UPDATE: From what I read, Process Monitor is very similar to Filemon. With either of these tools you can find which process accessed your file at what point. You can add a filter for your file before you start monitoring.

The other thing you could try is to get a lock on the file if it exists.

10 Comments

Filemon is no longer supported in windows according to a google search. I tried Process Monitor but I'm not sure how to identify which process is using the file. Explain?
You want Process Explorer. Fire it up, hit CTRL-F, and type in part of the file name, it will show you all the handles that match the name.
Got it! When the error occurs during debugging, Process Explorer shows this: ConsoleTest.vshost.exe (Process) 4372 (PID) Handle (Type) C:\Users\MadDebater\Desktop\ConsoleTest2\ConsoleTest\bin\Debuall_results.txt (Handle or DLL). What should I do next?
It ran this code when the exception was passed --> File.WriteAllText(@"all_results.txt", result);
Second time I ran it, the results were: <Non-existent Process> (Process), 3712 (PID), Handle (Type), C:\Users\MadDebater\Desktop\ConsoleTest2\ConsoleTest\bin\Debug\all_results.txt (Handle or DLL).
1

Maybe the file is accessed by a virus scanner or the Windows indexing service?

1 Comment

Shut off the virus scanner, Google Desktop, and the windows indexing service. No luck.
0

Try writing the file to another directory outside of the debug folder.

1 Comment

Tried using c:/all_results.txt
0

Just a "wild shot" - does it help if you place the file in a more predictable location like 'c:\all_results.txt'?


0

Try putting Thread.Sleep(1000) in your loop. As someone mentioned above, the file doesn't always get released in time for the next iteration of the loop.


0

As others have stated, opening and closing the file repeatedly might be the issue. One solution not mentioned is to keep the file open for the duration of the processing. Once complete, the file can be closed.

2 Comments

Can you show me how this can be done through the code please? It's much clearer that way.
Open the FileStream before the foreach loop. Inside the loop call .WriteLine(). After the foreach has finished, close the FileStream.
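That comment's idea could be sketched roughly like this, assuming the cleaned-up tweets are collected into a list first (BatchWriter and WriteAll are illustrative names):

```csharp
using System.Collections.Generic;
using System.IO;

static class BatchWriter
{
    // Open the file once, write every result, and close it
    // only after the whole loop has finished.
    public static void WriteAll(string path, List<string> results)
    {
        using (StreamWriter writer = new StreamWriter(path, false))
        {
            foreach (string result in results)
            {
                writer.WriteLine(result); // handle stays open between iterations
            }
        } // a single open/close for the whole batch
    }
}
```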
0

Pedro:

As others have stated, opening and closing the file repeatedly might be the issue. One solution not mentioned is to keep the file open for the duration of the processing. Once complete, the file can be closed.

Or, alternatively, collect your text in a StringBuilder or some other in-memory text storage and then dump the text to the file once the loop finishes.
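The in-memory alternative might look like this; BufferedWriter and WriteBuffered are assumed names for illustration:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;

static class BufferedWriter
{
    // Accumulate all output in a StringBuilder, then touch the file exactly once.
    public static void WriteBuffered(string path, List<string> results)
    {
        var sb = new StringBuilder();
        foreach (string result in results)
        {
            sb.AppendLine(result);
        }
        File.WriteAllText(path, sb.ToString()); // one open/write/close in total
    }
}
```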

3 Comments

The Bayesian classifier takes in a .txt file as an input unfortunately so I have no choice. Dumping the whole thing in will not give me a classification for 1 Twitter tweet.
Then you can try writing a single tweet on a file, so that no 2 iterations write to the same file. This is all done to give the OS enough time to close the file and release its handle. Knowing all the file names you have written to can allow you to feed them one by one to the classifier.
Good point about in-memory being another alternative to repeated file writes.
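The one-file-per-tweet idea from the comment above could be sketched like this (PerTweetWriter and the naming scheme are assumptions, not the poster's code):

```csharp
using System.Collections.Generic;
using System.IO;

static class PerTweetWriter
{
    // Write each result to its own file so no two iterations ever reopen
    // the same file; the returned paths can be fed to the classifier one by one.
    public static List<string> WriteEach(string directory, List<string> results)
    {
        var paths = new List<string>();
        for (int i = 0; i < results.Count; i++)
        {
            string path = Path.Combine(directory, "result_" + i + ".txt");
            File.WriteAllText(path, results[i]);
            paths.Add(path);
        }
        return paths;
    }
}
```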
0

I found this post while I had a similar problem. The advice given here gave me an idea, so for this purpose I wrote the following method:

public static void ExecuteWithFailOver(Action toDo, string fileName)
{
    // MaxAttempts and Logger are defined elsewhere in this class;
    // Thread.Sleep needs System.Threading, CultureInfo needs System.Globalization.
    for (var i = 1; i <= MaxAttempts; i++)
    {
        try
        {
            toDo();

            return;
        }
        catch (IOException ex)
        {
            Logger.Warn("File IO operation is failed. (File name: {0}, Reason: {1})", fileName, ex.Message);
            Logger.Warn("Repeat in {0} milliseconds.", i * 500);

            if (i < MaxAttempts)
                Thread.Sleep(500 * i);
        }
    }

    throw new IOException(string.Format(CultureInfo.InvariantCulture,
                                        "Failed to process file. (File name: {0})",
                                        fileName));

}

then I used the method in the following way

    Action toDo = () =>
                   {
                       if (File.Exists(fileName))
                           File.SetAttributes(fileName, FileAttributes.Normal);

                       File.WriteAllText(
                           fileName,
                           content,
                           Encoding.UTF8
                           );
                   };

    ExecuteWithFailOver(toDo, fileName);

Analyzing the logs later, I discovered that the cause of my trouble was an attempt to access the same file from parallel threads. But I still see some pros in using the suggested fail-over method.


0

Try using lock around your file operations. http://msdn.microsoft.com/en-us/library/c5kehkcz.aspx
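A minimal sketch of that suggestion, assuming the writes all happen inside one process (LockedWriter is an illustrative name). Note that lock only serializes threads within the same process; it does not stop another process from holding the file:

```csharp
using System.IO;

static class LockedWriter
{
    private static readonly object FileLock = new object();

    // Serializes file writes across threads of this process only.
    public static void Write(string path, string text)
    {
        lock (FileLock)
        {
            File.WriteAllText(path, text);
        }
    }
}
```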

