I would appreciate some suggestions on clustering the rows of a binary matrix.
Given a binary vector (a generic row of the matrix Bb) containing an unknown
number of 1s mixed with 0s, the problem is to define clusters according to
this rather vague and incomplete description from a paper:
 " Determine the clusters  U1 ,U2 ,....,U c  for each scale (matrix  row) in 
the binary matrix Bb separately. The clustering is performed scale by scale 
without overlapping across the scales (rows) such that each cluster contains 0s 
and a 1 positioned near the center of the cluster. If the whole row of the 
matrix Bb doesn't have any 1s, then all coefficients in that scale are assumed 
to form just one cluster. The end result of all this is a cluster template that 
is used for feature extraction in .."
They ignore cases like:
1. a row of matrix Bb containing a 1 at either end (as its first and/or last
element)
2. a row of matrix Bb containing two or more adjacent 1s (no 0 in between)
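For concreteness, here is a minimal sketch (in R) of my current reading of
that description: split each row at the midpoints between consecutive 1s, so
that each 1 ends up roughly at the center of its own cluster, and treat an
all-zero row as a single cluster. The midpoint rule and the helper name
row_to_clusters are my own guesses, not something stated in the paper.

## Hypothetical reading of the rule: cluster boundaries lie halfway between
## consecutive 1s; a row with no 1s forms a single cluster.
row_to_clusters <- function(x) {
  ones <- which(x == 1)
  n    <- length(x)
  if (length(ones) == 0L)
    return(rep(1L, n))                  # no 1s: the whole scale is one cluster
  # boundaries at the midpoints between consecutive 1s
  b <- floor((head(ones, -1) + tail(ones, -1)) / 2)
  # label each position by how many boundaries lie strictly to its left
  findInterval(seq_len(n), b + 1) + 1L
}

Bb <- rbind(c(0, 1, 0, 0, 1, 0),
            c(0, 0, 0, 0, 0, 0))
t(apply(Bb, 1, row_to_clusters))        # one vector of cluster labels per row

Under this reading the two ambiguous cases at least do something defined: a 1
at either end just gives an off-center cluster, and adjacent 1s give one
cluster each with no 0 in between, but I have no idea whether that is what the
authors intended.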

Thank you so much in advance,
Maura




