Code: Select all
fs := TFileStream.Create('C:\LocalWork\a.bin', fmCreate);
Try
  ZMemTable1.SaveToStream(fs);
Finally
  FreeAndNil(fs);
End;

ZMemTable1.Close;

fs := TFileStream.Create('C:\LocalWork\a.bin', fmOpenRead);
Try
  ZMemTable1.LoadFromStream(fs);
Finally
  FreeAndNil(fs);
End;
Please check: TMemoField should be a descendant of TBlobField. Otherwise the code wouldn't even reach that branch, since there should be a check like this beforehand:
Code: Select all
If Self.Fields[b] Is TBlobField Then [...]
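That check is enough to catch memo fields too, because in Delphi's DB unit TMemoField descends from TBlobField, so the Is test succeeds for both. A minimal sketch of the idea (field index b and the variable names are just for illustration):

Code: Select all
// TMemoField inherits from TBlobField, so both land in this branch
If Self.Fields[b] Is TBlobField Then
Begin
  // Blob and memo content handled here
End;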
It would help if I could debug these somewhere. I'm a bit confused: if the code works on D7, it really should work on D2006 as well. Personally I'd prefer the VPN solution - it's easy to implement and provides considerable security.
As for the format: I implemented import/export functionality in my DB client not long ago. Currently binary, TSV and CSV are supported, but what I wanted to say is: if we know the format, it can be done.
Also, I sent another pull request with the latest version of my ZMemTable.pas, which was tested on D7. Since GitHub sync is off, the first two commits are included, but they can be ignored :)
The size - unfortunately that's out of my control. The issue is that if a string field is defined with a size of 200 and it contains only the letter 'a', GetData still returns the full 200 bytes.
There's a commented-out section in the code which strips the trailing zeroes to reduce the size, but it messes something up in numeric fields.
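One way around the numeric breakage might be to restrict the trimming to string-typed fields only, where trailing #0 bytes carry no information. A rough sketch, assuming hypothetical Buf/BufSize names for the buffer returned by GetData:

Code: Select all
Var
  Len: Integer;
Begin
  If Self.Fields[b].DataType In [ftString, ftFixedChar] Then
  Begin
    // Only string buffers are zero-padded filler; trim from the end
    Len := BufSize;
    While (Len > 0) And (Buf[Len - 1] = 0) Do
      Dec(Len);
  End
  Else
    Len := BufSize; // Numeric fields: keep the buffer intact
End;

This is only a sketch of the idea, not tested against the actual ZMemTable.pas code paths.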
A Stream Read error usually means there is nothing left to read. Are you sure the stream is reset to position 0 before LoadFromStream is called? Can you reproduce the issue with a TMemoryStream?
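For reference, a minimal repro with TMemoryStream would look like this; the key line is resetting Position, since SaveToStream leaves the stream positioned at its end:

Code: Select all
ms := TMemoryStream.Create;
Try
  ZMemTable1.SaveToStream(ms);
  ZMemTable1.Close;
  ms.Position := 0; // Without this, LoadFromStream reads from the end and fails
  ZMemTable1.LoadFromStream(ms);
Finally
  FreeAndNil(ms);
End;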