blob field

Forum related to version 6.5.1 (alpha) and 6.6.x (beta) of ZeosLib's DBOs

Moderators: gto, cipto_kh, EgonHugeist

pbturner
Fresh Boarder
Posts: 8
Joined: 04.12.2006, 15:23

blob field

Post by pbturner »

Have an application written in C++ that currently writes 19k worth of formatted data to a file. Need to have a similar function that can write (and read) the data to (and from) a blob field in our Firebird database. The current mechanism is numerous fprintf calls to output the data to an fopen'ed file.

is it easier to write the data to a temporary file, then read it back into the
blob field in a single binary read operation? or is there a similar mechanism to the multiple fprintf's we're currently using?
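the rough idea i have in mind looks something like this (a delphi-ish sketch from my old pascal days, untested - the field name and procedure name are just made up for illustration, the real app is C++ builder):

{ rough, untested sketch - 'REPORT_DATA' and the procedure name are invented }
{ uses Classes, DB, ZDataset }
procedure LoadTempFileIntoBlob(Query: TZQuery; const TempFile: string);
begin
  { the existing fprintf code has already written TempFile }
  Query.Edit;
  (Query.FieldByName('REPORT_DATA') as TBlobField).LoadFromFile(TempFile);
  Query.Post;
  { then commit the transaction on the connection }
end;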

any hints/pointers/examples to help me move along this winding road?

TIA

Pete Turner
Sun Nuclear Corp.
Bottleneck
Fresh Boarder
Posts: 6
Joined: 09.08.2006, 09:48

Solution for blob error while writing blobs larger than 18K

Post by Bottleneck »

Hello pbturner

Had the same problem while writing BLOBs greater than 18k to Firebird.
Posted that problem in 2005. No fix until now.

My solution worked for the last two years without problems.

Look for Procedure "ReadBlobBufer" in File "ZDbcInterbase6Utils.pas"

Insert the following two lines before the line " { Copies data to blob buffer }":


if (Size > SegmentLenght) then
  SegmentLenght := Size;


See attached file.

Let me know if this solution solves your problem.

Greetings from Germany
Peter
mdaems
Zeos Project Manager
Posts: 2766
Joined: 20.09.2005, 15:28
Location: Brussels, Belgium

Post by mdaems »

Does Firebird allow unlimited segment sizes? If so, this solves the problem. Otherwise this could cause trouble when using bigger files. You are actually telling the program to always use segments as large as your blob size.
Do you know if the maximum segment size has changed over time (Interbase and Firebird versions)? Even if the sky is the limit, I think it's better to use chunks of a reasonable maximum size to avoid wasting memory. I think it's better to find the real problem here.
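Just to illustrate what I mean by "chunks of a reasonable maximum size" (this is not the actual Zeos code, only the general pattern):

{ not the real ReadBlobBufer - just the general idea: copy in fixed-size pieces }
{ instead of one segment as big as the whole blob. uses Classes. }
procedure CopyInChunks(Source, Dest: TStream);
const
  MaxChunk = 16 * 1024;   { a reasonable upper bound, independent of the blob size }
var
  Buffer: array[0..MaxChunk - 1] of Byte;
  BytesRead: Integer;
begin
  repeat
    BytesRead := Source.Read(Buffer, MaxChunk);
    if BytesRead > 0 then
      Dest.WriteBuffer(Buffer, BytesRead);
  until BytesRead = 0;
end;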
I see you posted this problem before. Can you please post it again, but in the new bug tracker this time? We had to drop all old bug reports because nobody on the (new) team knew what their status was. Problems at takeover.
Can you provide a small sample program that demonstrates the problem? (Zip together the program, sample data and schema creation SQL, please.)
Besides: this is a very specific problem, which means we need somebody who knows how blobs work (I don't) and who uses Firebird. I don't know anybody who matches that description (unless cipto_kh also uses a lot of blobs).

Mark
pbturner
Fresh Boarder
Posts: 8
Joined: 04.12.2006, 15:23

Post by pbturner »

hi Peter (and Mark)

not sure what happened with the attached file, but i only got 1 page worth of code. it ended 5 lines after the changes you said you made. based on what i can remember from my pascal days, i think there's more to the file that's missing.

as a newbie to blobbies, i would welcome any suggestions as to how to proceed with this project, either as a single segment or multiple segments. preferably multiple, as i can see the file size growing and i think the limit for firebird is 32k (at least with 1.5.3).

had the reference to how we currently do it simply to indicate that there's a lot of formatting that takes place to get to the "output file" format. since the reading and writing of this file (and its format) are already developed and deployed, i wanted to make use of this by simply placing the data into a single field in our database. since our application uses all or none of the data, it made more sense to use a blob rather than over 1000 individual fields. again, that's why i referenced writing to a file and then reading back in. figured i could write to a temporary file and then read it back into a blob field in my data record in the database.

so far, with a lot of testing this and that, i've gotten almost to the point of getting the data into the database. i have set up a TStream using ZQuery->CreateBlobStream to point to the blob, and a TFileStream pointing to the file. i have tried the CopyFrom method of the BlobStream to get the data into the data field. it appears to have worked, as there are no errors, but when i check the database the field still indicates NULL. yes, i do a commit after. not sure if this is the correct way to do it, or whether i should do a segmented blob instead. again, any help/support/guidance would be greatly appreciated.
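for reference, here's roughly what that code looks like (sketched in delphi syntax from my pascal days; the real code is C++ builder, and the file/field names are simplified). i'm not sure whether the Edit/Post around it are needed or in the right place - maybe that's where it goes wrong:

{ names simplified for the example; real code is C++ Builder. uses Classes, DB, ZDataset. }
procedure WriteFileToBlob(ZQuery: TZQuery; const FileName: string);
var
  BlobStream, FileStream: TStream;
begin
  ZQuery.Edit;   { put the record into edit mode }
  BlobStream := ZQuery.CreateBlobStream(ZQuery.FieldByName('REPORT_DATA'), bmWrite);
  try
    FileStream := TFileStream.Create(FileName, fmOpenRead);
    try
      BlobStream.CopyFrom(FileStream, 0);   { count = 0 copies the whole file }
    finally
      FileStream.Free;
    end;
  finally
    BlobStream.Free;
  end;
  ZQuery.Post;   { then commit the transaction on the connection }
end;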

TIA,

pbt.