The difference between this case and my previous questions is that every string in the Input_cell has the same length. One earlier question dealt with strings of different lengths; another only transformed a single string into a matrix.
I have 3 sequences in a cell-array:
Input_cell = {'ABCD', 'ACDB', 'BCAD'};
S1= 'ABCD' % which means A<B<C<D
S2= 'ACDB' % which means A<C<D<B
S3= 'BCAD' % which means B<C<A<D
I want to convert each string in Input_cell into a square matrix M (here 4-by-4, one row and column per letter) that satisfies these conditions:
The off-diagonal entries M(i,j) and M(j,i) are random
M(i,i) = 0.5
M(i,j) + M(j,i) = 1
M(i,j) < M(j,i) whenever letter i precedes letter j in the sequence. For example, if A<B then M(A,B) < M(B,A)
% For example, if S1 = 'ABCD' (which means A<B<C<D), the expected matrix M1 is:
A B C D
A 0.5 0.3 0.2 0.1
B 0.7 0.5 0 0.4
C 0.8 1 0.5 0.1
D 0.9 0.6 0.9 0.5
% If S2 = 'ACDB' (which means A<C<D<B), the expected matrix M2 is:
A B C D
A 0.5 0.3 0.2 0.1
B 0.7 0.5 0.6 0.8
C 0.8 0.4 0.5 0.1
D 0.9 0.2 0.9 0.5
% If S3 = 'BCAD' (which means B<C<A<D), the expected matrix M3 is:
A B C D
A 0.5 0.6 0.7 0.1
B 0.4 0.5 0.2 0.3
C 0.3 0.8 0.5 0.1
D 0.9 0.7 0.9 0.5
How can I create such matrices from a given cell array of sequences?
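For reference, here is a minimal sketch of one way to do this, assuming rows/columns are always ordered A,B,C,D and the random pairs are drawn uniformly (all variable names are my own, not from the original question):

```matlab
Input_cell = {'ABCD', 'ACDB', 'BCAD'};
letters = 'ABCD';
n = numel(letters);
M = cell(size(Input_cell));

for k = 1:numel(Input_cell)
    s = Input_cell{k};
    % pos(i) = position of letters(i) within the sequence s,
    % so a smaller pos means the letter ranks lower in the ordering
    [~, pos] = ismember(letters, s);
    Mk = 0.5 * eye(n);                 % diagonal entries are 0.5
    for i = 1:n
        for j = i+1:n
            r  = rand;                 % random value in (0,1)
            lo = min(r, 1 - r);        % smaller of the pair
            hi = max(r, 1 - r);        % larger; lo + hi = 1 by construction
            if pos(i) < pos(j)         % letters(i) precedes letters(j)
                Mk(i,j) = lo;  Mk(j,i) = hi;
            else
                Mk(i,j) = hi;  Mk(j,i) = lo;
            end
        end
    end
    M{k} = Mk;
end
```

Each pair (r, 1-r) automatically sums to 1, and assigning the smaller value to the earlier letter enforces M(i,j) < M(j,i). (In the rare event that rand returns exactly 0.5, the strict inequality would fail; redraw if that matters.)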