
I need to find a way to insert a file header (not a column header) and dump my data after that.

How would you do that?

Regards,

  • Dump to the terminal and redirect it to the existing file (with the header already in it) using >> (append). Commented Jun 3, 2010 at 9:54
  • How do you dump to the terminal with SELECT ... INTO OUTFILE? Commented Jun 3, 2010 at 12:28

4 Answers


Not quite a "clean" solution, but this works.

Linux or Unix terminal:

prompt$ echo YourFileHeader > aFile.txt ; mysql -u YOUR_USER -p YOUR_DATABASE --execute="SELECT ..." >> aFile.txt

Windows command window (you have to type the commands one by one):

c:\> echo YourFileHeader > aFile.txt
c:\> mysql -u YOUR_USER -p YOUR_DATABASE --execute="SELECT ..." >> aFile.txt

MySQL will prompt for your password.

I do believe select ... into outfile ... is a bit slow; I've always had better results with this approach. Downside: the query output uses the default field and line terminators (\t and \n respectively), and I don't know of a way to change these from the mysql client itself, but you can post-process the output in the shell.
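For example, a minimal sed pass (a sketch using GNU sed, assuming your data itself contains no tabs or embedded commas) turns the tab-separated output into CSV before appending; adding --skip-column-names (or -N) suppresses the mysql client's column-name row so only your own file header appears:

prompt$ echo YourFileHeader > aFile.txt
prompt$ mysql -u YOUR_USER -p YOUR_DATABASE --skip-column-names --execute="SELECT ..." | sed 's/\t/,/g' >> aFile.txt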

Hope this works for you.


2 Comments

I finally used something like that, except on GNU/Linux. Thank you.
A long time since this answer... but about the downside I mentioned: you can tweak the terminators on Linux/Unix using sed. You can see an example here: lowfatlinux.com/linux-sed.html

We can achieve this with a UNION, as shown below: the first SELECT produces the header row as string literals, and the second produces the data (UNION ALL is used so identical data rows are not deduplicated away).

SELECT * INTO OUTFILE "/tmp/COD-Shipment-Report.csv"
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
FROM (
    SELECT 'Shipment Id', 'Invoice No', 'POD Number', 'Order Id',
           'Shipment Created', 'Shipment Created-TS', 'Weight', 'Qty',
           'Total Amount', 'Courier Agency', 'Payment Method', 'Billing State'
    FROM DUAL

    UNION ALL

    SELECT shipment.increment_id 'Shipment Id',
           IFNULL(invoice.increment_id, '') 'Invoice No',
           shipment.track_no 'POD Number',
           shipment.order_increment_id 'Order Id',
           DATE_FORMAT(ADDTIME(shipment.created_at, '05:30:00'), '%d-%m-%Y') 'Shipment Created',
           ADDTIME(shipment.created_at, '05:30:00') 'Shipment Created-TS',
           shipment.shipping_weight 'Weight',
           shipment.total_qty 'Qty',
           SUM(shpitm.qty * shpitm.price) 'Total Amount',
           shipment.track_title 'Courier Agency',
           payment.method 'Payment Method',
           IFNULL(shipping.region, '') 'Billing State'
    FROM sales_flat_shipment_grid shipment
    JOIN sales_flat_shipment ship ON shipment.entity_id = ship.entity_id
    JOIN sales_flat_invoice invoice ON invoice.order_id = ship.order_id
    JOIN sales_flat_shipment_item shpitm ON ship.entity_id = shpitm.parent_id
    JOIN sales_flat_order_payment payment ON shipment.order_id = payment.parent_id
    JOIN sales_flat_order_address shipping ON ship.shipping_address_id = shipping.entity_id
    WHERE payment.method = 'cashondelivery'
      AND DATE(ship.created_at) BETWEEN SUBTIME('2011-12-01 00:00:00', '05:30:00')
                                    AND SUBTIME('2011-12-03 00:00:00', '05:30:00')
    GROUP BY shipment.entity_id
) A
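Distilled to the general pattern (the table and column names below are hypothetical placeholders):

    SELECT * INTO OUTFILE '/tmp/report.csv'
      FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
    FROM (
        SELECT 'col_a', 'col_b'            -- header row as string literals
        UNION ALL
        SELECT col_a, col_b FROM my_table  -- the actual data
    ) t;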

Thanks! Srinivas Mutyala



Here's a different solution that may be better or worse, depending on your situation:

http://jasonswett.net/how-to-get-headers-when-using-mysqls-select-into-outfile/



I came up with this solution. This way you don't have to know the field names. Very useful if you just want to make quick dumps of tables.

    SET group_concat_max_len = 65533; -- default is 1024, which would truncate the header list on wide tables
    select @target:='sourcetablename'; -- the source table you're going to be dumping
    select @ts:=replace(replace(now(), ':', '-'), ' ', '_'); -- a time stamp, sanitized for use in a file name
    select @csvoutfile:=concat('d:\\\\', @target, @ts, '.csv'); -- doubled backslashes survive the second round of string parsing at PREPARE time

--    
-- Pull the field names from the schema
--

    select 
        @field_headers:=GROUP_CONCAT(CONCAT('\'', COLUMN_NAME, '\'')
                                     ORDER BY ORDINAL_POSITION) -- ORDER BY must sit inside GROUP_CONCAT to guarantee column order
    from
        INFORMATION_SCHEMA.COLUMNS
    WHERE
        TABLE_NAME = @target
        AND TABLE_SCHEMA = DATABASE(); -- don't pick up a same-named table from another schema

--
--  build a select statement to include the field names and "union all" it with the table 
--  and provide the outfile parameters
--

    select 
        @betterexport:=concat('select ',
                @field_headers,
                ' union all ',
                ' select * from ',
                @target,
                '  ',
                'into outfile \'',
                @csvoutfile,
                '\' FIELDS TERMINATED BY \',\' OPTIONALLY ENCLOSED BY \'"\' LINES TERMINATED BY \'\\n\';');

--
-- prepare and execute the query.
--

    prepare qry1 from @betterexport;
    execute qry1;
    deallocate prepare qry1;
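For a hypothetical table customers(id, name), the statement this builds would look roughly like the following (table name and date are made up for illustration):

    select 'id','name' union all select * from customers  into outfile 'd:\\customers2024-01-05_12-00-00.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n';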

Or better still, here's a procedure version so you can just call it at will (e.g. call dump_csv('tablename');):

DELIMITER $$

CREATE DEFINER=`root`@`localhost` PROCEDURE `dump_csv`(in target varchar(255))
BEGIN
    declare ts varchar(50);
    declare csvoutfile varchar(255);
    declare field_headers varchar(2048);
    declare betterexport varchar(2048);
    DECLARE curs CURSOR FOR
        select GROUP_CONCAT(CONCAT('\'', COLUMN_NAME, '\'') ORDER BY ORDINAL_POSITION) -- ORDER BY inside GROUP_CONCAT guarantees column order
        from INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = target AND TABLE_SCHEMA = DATABASE();

SET group_concat_max_len = 65533; -- default is only 1024 and will truncate the header list on tables with a lot of fields

--    
-- Pull the field names from the schema
--
    OPEN curs;
    fetch curs into field_headers;
    CLOSE curs;

    set ts=replace(replace(now(), ':', '-'), ' ', '_'); -- a time stamp, sanitized for use in a file name
    set csvoutfile=concat('d:\\\\', target, ts, '.csv'); -- build filename

--
--  build a select statement to include the field names and "union all" it with the target table 
--  and provide the outfile parameters
--

    set @betterexport=concat('select ',
                field_headers,
                ' union all ',
                ' select * from ',
                target,
                '  ',
                'into outfile \'',
                csvoutfile,
                '\' FIELDS TERMINATED BY \',\' OPTIONALLY ENCLOSED BY \'"\' LINES TERMINATED BY \'\\n\';');
--
-- prepare and execute the query.
--

    prepare qry1 from @betterexport;
    execute qry1;
    deallocate prepare qry1;
--
-- beer.
--
END$$

DELIMITER ;
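One caveat that applies to every INTO OUTFILE approach on this page: recent MySQL versions restrict where the server may write files via the secure_file_priv variable (writes are limited to a specific directory, or disabled entirely when it is NULL). You can check your server's setting with:

    SHOW VARIABLES LIKE 'secure_file_priv';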

Enjoy! :)

-- Chris

