I was able to do a small amount of analysis on Smart Morph data storage. It's not much but it's a start.
The 32x32-cell color screen image shown after a Learn is stored in standard PNG format. In a bulk dump of a performance it will be the contents of the '06 00 11 00' (FM-X Smart Morph PNG Data) or '06 00 21 00' (AN-X Smart Morph PNG Data) table shown on p.217 of the Data List doc.
In the binary User file (*.Y2U on an M) the PNG image will be in the DSPG data area:
44 53 50 47 00 00 03 9C 00 00 00 01 44 61 74 61
The first 4 bytes are ASCII 'DSPG' and the last 4 are ASCII 'Data'.
The '00 00 03 9C' is the remaining length of the data and the '00 00 00 01' indicates there is only 1 set of data.
00 00 03 90 89 50 4E 47 0D 0A 1A 0A 00 00 00 0D
This is the start of the 'data' chunk and the '00 00 03 90' is the remaining length - it is '0C' (12) bytes shorter than the total length.
The '89 50 4E 47 0D 0A 1A 0A' is the standard PNG header. The '00 00 00 0D' is the size of the following IHDR chunk.
49 48 44 52 00 00 00 20 00 00 00 20 08 06 00 00
'49 48 44 52' is 'IHDR', the required first chunk. The two '00 00 00 20' values are the width (32) and height (32) of the image.
The '08' is the bit depth, '06' the color type (6 = truecolor with alpha), '00' the compression method and '00' the filter method.
There are plenty of online references for the PNG format: http://www.libpng.org/pub/png/spec/1.2/PNG-Chunks.html
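If you want to pull the image out of a .Y2U file yourself, here is a rough Java sketch based on the layout above. It assumes a single 'Data' set and that the 4-byte length right before the PNG signature covers the PNG bytes that follow - I haven't verified the layout beyond what is described here.

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class DspgExtract {
        public static void main(String[] args) throws IOException {
            byte[] file = Files.readAllBytes(Path.of(args[0]));   // the .Y2U file
            int pos = indexOf(file, "DSPG".getBytes());
            if (pos < 0) { System.out.println("No DSPG area found"); return; }

            ByteBuffer buf = ByteBuffer.wrap(file, pos + 4, file.length - pos - 4);
            int remaining = buf.getInt();          // e.g. 00 00 03 9C
            int sets      = buf.getInt();          // e.g. 00 00 00 01
            buf.position(buf.position() + 4);      // skip ASCII 'Data'
            int pngLen    = buf.getInt();          // e.g. 00 00 03 90

            byte[] png = new byte[pngLen];         // starts with 89 50 4E 47 0D 0A 1A 0A
            buf.get(png);
            Files.write(Path.of("smartmorph.png"), png);
            System.out.printf("sets=%d remaining=%d png=%d bytes%n", sets, remaining, pngLen);
        }

        // simple byte-pattern search
        private static int indexOf(byte[] haystack, byte[] needle) {
            outer:
            for (int i = 0; i <= haystack.length - needle.length; i++) {
                for (int j = 0; j < needle.length; j++) {
                    if (haystack[i + j] != needle[j]) continue outer;
                }
                return i;
            }
            return -1;
        }
    }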
A bulk dump of a Morph performance will also contain the '06 00 10 00' Smart Morph Data File (p.217). This is a proprietary format of data defining the SOM that is used when you use 'Learn'. It is the AI engine data and doesn't contain the actual result data.
There is a DSOM 'data' chunk in a user/library file that contains the almost 1 million bytes that represent the result of the Learn process. As far as I can tell this is a fixed size regardless of the number of parameters actually used in the Morph. I examined the data from the Hydra Drone performance and a simple morph where parts 9 and 10 were nothing more than Init Normal (FM-X) with different Alternate Pan settings.
As expected, the color map after the Learn contained only two colors since only the pan parameter differed between the two parts. But the resulting Morph data took the same amount of space in the user file: '0D B1 53' or 897,363 bytes.
A 32x32 grid is 1024 cells. At 1 byte (0-127) per value, 897,363 bytes spread over 1024 cells leaves room for a maximum of about 876 parameter values per cell (897,363 / 1024 ≈ 876). It appears that each parameter in the result effectively gets its own grid, so that every cell you select has a value for that parameter. So if you select the cell at Row 3, Column 4, the code will get each parameter's value from that same cell location in that parameter's grid.
When you select parameters for the 3 colors you can choose from:
Common 1 - 34 parameters
Common 2 - 35 parameters
Op 1 - 35 parameters
. . .
Op 8 - 35 parameters
That is a total of 35 * 8 = 280 operator parameters plus 34 + 35 = 69 common parameters, or 349 possible parameters. Even if each parameter used 2 bytes, the data block could still handle 438 parameter grids (876 / 2).
SUMMARY:
1. The user has NO ability to select or restrict the parameters that will be used in the Morph. There is an IdeaScale suggestion (please read and vote on it) asking Yamaha to add that functionality. That would be extremely useful in finding/designing new sounds since you could control what is being morphed, save those results, and then add a new piece to the morphing and gradually build a sound up.
2. The parameters that the user selects for Red, Green and Blue do nothing more than add color to the grid cells for those three parameters. So if you select 'cutoff' for Green, then each of the 32x32 cells will have a shade of green reflecting the value of the 'cutoff' parameter for that cell. The Green color only lets you visualize how the 'cutoff' level varies throughout the entire grid, so that if you want a higher cutoff in the sound you would select a cell that is a darker green.
3. The Morph data doesn't appear to be compressed. It appears to be a fixed size regardless of the number of parameters that are actually morphed - a size large enough to handle all of the parameters available for morphing if they were all morphed simultaneously. Most likely the Learn process is smart enough not to spend time morphing values that don't change and simply stores the common value in each cell of the grid for that parameter.
4. I haven't been able to tell if there is an internal limit to the number of parameters that can take part in the morphing. There are at least the 349 listed above that COULD take part, but the AI engine could restrict it to 100 or some other number.
5. You can NOT use a bulk dump to fully restore a morph performance since it doesn't contain the resulting morph data. It does contain the AI engine data needed to do a Learn. Not sure it would make sense to try to restore a morph that way, but you could.
Interesting new info - maps have space for 876 parameters!
In my initial post I mentioned this:
A 32x32 grid is 1024 cells. At 1 byte (0-127) per value, 897,363 bytes spread over 1024 cells leaves room for a maximum of about 876 parameter values per cell.
Analysis of the data in the DSOM section of a morph performance shows that each block contains EXACTLY 876 bytes and that there are 1024 (32x32) such blocks in total.
This means that a block of 876 bytes has values for ALL the parameters. So the data storage acts as if it is arranged as a 2-dimensional matrix: Row x Column. Selecting a row and column, manually or via the Super Knob, identifies a single cell and thus the block of 876 bytes containing the new 1-byte value for each parameter.
A simple mapping table could then be used to update the real parameters with the values from the data block, as sketched below.
The above structure organizes the data to be about as efficient as possible for both the initial population of values by the AI SOM engine and for the real-time extraction/update of so many parameters.
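In code terms, the per-cell update could be as simple as the loop below. This is purely illustrative - the contents of the mapping table are unknown to me, and applyParameter() just stands in for whatever the engine actually does when it sets a parameter.

    class MorphApply {
        // 'block' is one cell's 876 bytes; 'slotToParamId' is a hypothetical
        // mapping table from byte slot (0-875) to a real parameter ID.
        static void applyCell(byte[] block, int[] slotToParamId) {
            for (int slot = 0; slot < block.length; slot++) {
                applyParameter(slotToParamId[slot], block[slot] & 0xFF);
            }
        }

        // stand-in for whatever the engine actually does with a new value
        static void applyParameter(int paramId, int value) {
            System.out.println("param " + paramId + " -> " + value);
        }
    }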
The parameter count I provided earlier of 349 was only for an FM-X morph. There are 118 parameters listed in the RED, GREEN and BLUE dropdowns for AN-X. If these are stored uniquely, that would use 467 (349 + 118) of the 876 byte slots in the data block of parameter values.
But it seems the capacity is already in place to expand the destination space from the current 349 to many more. I still have some more analysis to do to see if I can determine whether there is a pattern to the way specific parameters are assigned within the block of 876 bytes. For example, are the parameters for Op 1 in sequential positions? Are the parameters for the ops (1-8) stored together? Are the common parameters stored together?
More progress - the 1024 blocks of 876 bytes each are stored ordered by column, then by row within each column. So the 1st block represents the cell at C1/R1 of the graphic, the 2nd block is C1/R2, then C1/R3, ..., C1/R32, C2/R1, C2/R2 and so on.
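So finding the 876-byte block for a given cell is just index arithmetic. A minimal sketch, assuming the byte array holds only the 1024 cell blocks (DSOM header already stripped) and that you already know which of the 876 byte slots belongs to the parameter you want:

    class MorphLookup {
        static final int BLOCK_SIZE = 876;   // bytes per cell
        static final int ROWS = 32;          // 32x32 grid

        // col and row are 1-based, matching the C1/R1 notation above
        static int valueAt(byte[] cellBlocks, int col, int row, int paramSlot) {
            int blockIndex = (col - 1) * ROWS + (row - 1);   // C1/R1, C1/R2, ..., C2/R1, ...
            return cellBlocks[blockIndex * BLOCK_SIZE + paramSlot] & 0xFF;
        }
    }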
I was able to write simple Java code to transform the data so that all of the values for each parameter are grouped together serially. That produced 1024 values for each parameter as a sequence of bytes.
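The transform itself is just a reordering from cell-major to parameter-major. A simplified sketch of that kind of code (again assuming the input array holds only the 1024 blocks of 876 bytes):

    class MorphTranspose {
        static final int CELLS = 1024;   // 32x32 grid cells
        static final int SLOTS = 876;    // byte slots per cell

        // Reorders cell-major data (876 bytes per cell, 1024 cells) into
        // parameter-major data (1024 bytes per slot, 876 slots).
        static byte[] bySlot(byte[] cellMajor) {
            byte[] slotMajor = new byte[CELLS * SLOTS];
            for (int cell = 0; cell < CELLS; cell++) {
                for (int slot = 0; slot < SLOTS; slot++) {
                    slotMajor[slot * CELLS + cell] = cellMajor[cell * SLOTS + slot];
                }
            }
            return slotMajor;
        }
    }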
By setting the Morph Super Knob start and end positions to the top left and bottom left corners, I was able to observe the Operator 8 level values changing as I moved the Super Knob.
I was then able to find that exact sequence of values in the transformed data to identify where those parameters were stored.
That seems to confirm what I hypothesized earlier: there is a one-byte grid of 1024 (32x32) cells for every parameter, whether the parameter is used in the morph or not. Parameters not used in the morph have the same value in all 1024 cells.
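That also suggests a quick way to spot which byte slots actually take part in a given morph: after the reordering above, any slot whose 1024 values are all identical is presumably not being morphed. A rough check along those lines:

    class MorphUsage {
        // Returns the byte slots (0-875) whose 1024 cell values are not all
        // identical, i.e. the slots that actually vary across the grid.
        static java.util.List<Integer> varyingSlots(byte[] slotMajor) {
            java.util.List<Integer> varying = new java.util.ArrayList<>();
            for (int slot = 0; slot < 876; slot++) {
                byte first = slotMajor[slot * 1024];
                for (int cell = 1; cell < 1024; cell++) {
                    if (slotMajor[slot * 1024 + cell] != first) {
                        varying.add(slot);
                        break;
                    }
                }
            }
            return varying;
        }
    }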