Performance of JDBC connector for arrays
I'd like to note that if you have to export rather large volumes of data containing arrays, JDBC may not be the best choice. First, its performance degrades due to the overhead of converting native Matlab arrays into org.postgresql.jdbc.PgArray objects. Second, this may lead to a shortage of Java heap memory (and simply increasing the Java heap size may not be a panacea). Both points can be seen in the following picture, illustrating the performance of the datainsert method from Matlab Database Toolbox (which works with PostgreSQL through a direct JDBC connection):

The blue graph displays the performance of the batchParamExec command from the PgMex library (see https://pgmex.alliedtesting.com/#batchparamexec for details). The endpoint of the red graph corresponds to the maximum data volume that datainsert manages to pass into the database without any error. Data volumes greater than that maximum cause an “out of Java heap memory” error (the Java heap size is specified at the top of the figure). For further details of the experiments, please see the paper with full benchmarking results for data insertion.
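For reference, the JDBC-based insertion benchmarked above follows roughly the pattern below. This is a minimal sketch using the older Database Toolbox API (database with an explicit JDBC driver and datainsert); the <> placeholders are assumptions to fill in, and the exact handling of array columns varies between toolbox versions:

% Connect through the PostgreSQL JDBC driver
conn = database('<yourdb>','<your_postgres_username>','<your_postgres_password>',...
'org.postgresql.Driver','jdbc:postgresql://<yourhost>:<yourport>/<yourdb>');
% Wrap each row of outputnodes into its own cell, one array per tuple;
% it is this per-tuple conversion to org.postgresql.jdbc.PgArray
% that causes the overhead discussed above
dataCell = [num2cell(classifierIdVec(:)), num2cell(output_thresholds(:)),...
num2cell(outputnodes,2), num2cell(neuronNumVec(:))];
datainsert(conn,'neuron',...
{'classifier_id','threshold','weights','neuron_num'},dataCell);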
Example reworked
As can be seen, PgMex, being based on libpq (the official C application programmer's interface to PostgreSQL), delivers greater performance and is able to process data volumes of more than 2 GB.
Using this library, your code can be rewritten as follows (we assume below that all the parameters marked by <> signs are properly filled, that the table neuron already exists in the database and has the fields classifier_id of type int4, threshold of type float8, weights of type float8[] and neuron_num of type int4, and, finally, that the variables classifierIdVec, output_thresholds, outputnodes and neuronNumVec are already defined as numerical arrays of the sizes shown in the comments in the code below; if the types of the table fields differ, adjust the last command of the code accordingly):
% Create the database connection
dbConn = com.allied.pgmex.pgmexec('connect',[...
'host=<yourhost> dbname=<yourdb> port=<yourport> '...
'user=<your_postgres_username> password=<your_postgres_password>']);
% Parameterized INSERT: $3 takes a float8[] array per tuple
insertCommand = ['INSERT INTO neuron '...
'(classifier_id, threshold, weights, neuron_num) VALUES ($1,$2,$3,$4)'];
% Pack the data into a structure, one field per query parameter
SData = struct();
SData.classifier_id = classifierIdVec(:); % [nTuples x 1]
SData.threshold = output_thresholds(:); % [nTuples x 1]
SData.weights = outputnodes; % [nTuples x nWeights]
SData.neuron_num = neuronNumVec(:); % [nTuples x 1]
% Insert all tuples in a single batch call
com.allied.pgmex.pgmexec('batchParamExec',dbConn,insertCommand,...
'%int4 %float8 %float8[] %int4',SData);
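As a side note, if the neuron table did not exist yet, DDL matching the assumed field types could be run over the same connection; the sketch below assumes that pgmexec's 'exec' command executes a plain SQL statement:

% Sketch: create the neuron table with the field types assumed above
com.allied.pgmex.pgmexec('exec',dbConn,['CREATE TABLE IF NOT EXISTS neuron ('...
'classifier_id int4, threshold float8, weights float8[], neuron_num int4)']);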
Note that outputnodes does not need to be cut along its rows into separate arrays, because all of its rows are of the same length. If the arrays for different tuples had different sizes, they would have to be passed as a column cell array, with each cell containing its own array for the respective tuple, as shown in the sketch below.
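For example, a variable-length case might look like this (a minimal sketch with made-up values for nTuples = 3; the other fields of SData are filled as before):

% Sketch: arrays of different lengths go into a column cell array,
% one cell per tuple
SData.weights = {[0.1 0.2 0.3]; [0.4 0.5]; [0.6 0.7 0.8 0.9]}; % [3 x 1] cell
com.allied.pgmex.pgmexec('batchParamExec',dbConn,insertCommand,...
'%int4 %float8 %float8[] %int4',SData);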
EDIT: Currently PgMex has free academic licensing.
float8 is a valid type; it's the internal alias for double precision. You can actually store your matrix directly in PostgreSQL as a two-dimensional array, too.
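For example (a sketch with a hypothetical weight_matrices table, again assuming pgmexec's 'exec' command for plain SQL; note that PostgreSQL treats float8[] and float8[][] declarations identically, the dimensionality lives in the values themselves):

% Sketch: store an entire matrix as a single two-dimensional float8 array value
com.allied.pgmex.pgmexec('exec',dbConn,...
'CREATE TABLE weight_matrices (classifier_id int4, w float8[])');
com.allied.pgmex.pgmexec('exec',dbConn,...
'INSERT INTO weight_matrices VALUES (1, ''{{0.1,0.2},{0.3,0.4}}'')');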