Probably the most direct way to do this would be to create an SSIS package with a fixed-width Flat File data source. I'd make your destination something clearly distinct from your final table (e.g. a "rawdata" schema, or even a separate database named "rawdata"). You could use this initial mapping to create the table with suggested data types (but you really need to make sure you know your data).
SSIS
Once you've done this, you could create a ForEach Loop container in SSIS that iterates over a folder of these flat files and loads each one into your raw destination. If this is a one-time thing, you can do it by adding SQL Server Data Tools (SSDT) to your current Visual Studio 2010 installation (if you have one); otherwise the installer will put SSDT inside the VS 2010 Shell. If this isn't a one-off, you're going to need to look into your licensing for installing SSIS so you can deploy this as a package that you can schedule to run periodically.
Failing all that, you could generate the table schema by opening SSMS, right-clicking your target database, choosing Tasks > Import Data, picking a Flat File source with the fixed-width format, mapping your source columns (you're going to go through the pain of creating those columns if you really have more than 100), and then generating a CREATE TABLE script.
However, if you really only have those three columns ...
CREATE SCHEMA rawdata AUTHORIZATION <pick an owner here>;
CREATE TABLE rawdata.rawfields (
    fieldname varchar(8),      -- I'm actually unclear here because
                               -- the next column's name/purpose is unclear
    fieldsize int,             -- or another numeric data type
    fieldcomment varchar(255)
);
Then you could write a quick C# or PowerShell script to iterate over a directory containing the files. That would look something like:
C#
string fieldname, fieldsize, fieldcomments;
var files = System.IO.Directory.GetFiles("path", "*.ext");
foreach (var file in files)
{
    var lines = System.IO.File.ReadAllLines(file);
    foreach (var line in lines)
    {
        // Adjust these offsets/lengths to match your actual fixed-width layout
        fieldname = line.Substring(0, 8);
        fieldsize = line.Substring(8, 2);
        fieldcomments = line.Substring(10).TrimEnd();
    }
}
From here, I'd set up a connection to the SQL Server instance, create a parameterized insert command and insert the data in the variables.
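As a rough sketch (the connection string is a placeholder, and the table and column names are the ones assumed above), the parameterized command could be set up once and then executed for each parsed line inside the inner loop:

using System.Data;
using System.Data.SqlClient;

// The connection string below is a placeholder; point it at your instance and database.
using (var conn = new SqlConnection(@"Data Source=YOURSERVER;Initial Catalog=yourdatabase;Integrated Security=SSPI;"))
using (var cmd = new SqlCommand(
    "INSERT INTO rawdata.rawfields (fieldname, fieldsize, fieldcomment) " +
    "VALUES (@fieldname, @fieldsize, @fieldcomment);", conn))
{
    cmd.Parameters.Add("@fieldname", SqlDbType.VarChar, 8);
    cmd.Parameters.Add("@fieldsize", SqlDbType.Int);
    cmd.Parameters.Add("@fieldcomment", SqlDbType.VarChar, 255);
    conn.Open();

    // ...then, inside the inner foreach loop above, for each parsed line:
    cmd.Parameters["@fieldname"].Value = fieldname;
    cmd.Parameters["@fieldsize"].Value = int.Parse(fieldsize.Trim());
    cmd.Parameters["@fieldcomment"].Value = fieldcomments;
    cmd.ExecuteNonQuery();
}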
BULK INSERT
Create a format file:
<?xml version="1.0"?>
<BCPFORMAT
    xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="8"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="2"/>
    <FIELD ID="3" xsi:type="CharFixed" LENGTH="68"/>
    <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="\r\n"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="fieldname" xsi:type="SQLCHAR"/>
    <COLUMN SOURCE="2" NAME="fieldsize" xsi:type="SQLCHAR"/>
    <COLUMN SOURCE="3" NAME="fieldcomment" xsi:type="SQLCHAR"/>
  </ROW>
</BCPFORMAT>
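With that format file in place, you can also load a file straight from T-SQL with BULK INSERT (the paths below are placeholders, and keep in mind the SQL Server service, not your workstation, resolves them):

BULK INSERT rawdata.rawfields
FROM '\\path\to\data\file.ext'
WITH (FORMATFILE = '\\path\to\format\file.xml');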
Alternatively, you can write a PowerShell script or a C# app to iterate over the files in the directory (as above) and call bcp from the client; a sketch of the C# version follows the commands below. Assuming you can use a trusted connection:
bcp <<yourdatabase>>.rawdata.rawfields in \\path\to\data\file.ext -f \\path\to\format\file.xml -T
Otherwise, with SQL authentication:
bcp <<yourdatabase>>.rawdata.rawfields in \\path\to\data\file.ext -f \\path\to\format\file.xml -U username -P password
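If you go the C# route, the driver could be as simple as shelling out to bcp once per file. This is only a sketch: the paths and database name are placeholders, and it assumes bcp.exe is on the PATH.

using System.Diagnostics;

var files = System.IO.Directory.GetFiles(@"\\path\to\data", "*.ext");
foreach (var file in files)
{
    // Build the same bcp command line as above, once per data file
    var psi = new ProcessStartInfo
    {
        FileName = "bcp",
        Arguments = string.Format(
            @"yourdatabase.rawdata.rawfields in ""{0}"" -f \\path\to\format\file.xml -T",
            file),
        UseShellExecute = false
    };
    using (var proc = Process.Start(psi))
    {
        proc.WaitForExit();   // check proc.ExitCode to catch bcp failures
    }
}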