Large SQL datasets - again
-
- Posts: 216
- Registered: Sun Sep 23, 2007 11:08 pm
Large SQL datasets - again
I'm having problems working out how to interface with large MySQL datasets. Then I saw
how EMS SQL Manager (www.sqlmanager.net) does it, and I think that is
the way to go.
In the attached picture you can see the heading of a browse in EMS. The browsed
table contains 800,000+ rows/records. The point is that EMS automatically
limits the number of rows fetched by using the SELECT ... LIMIT clause.
The default LIMIT is 1,000 rows/records, so at no point will the dataset hold
more rows/records than the LIMIT (the current LIMIT is shown
between the green SKIP-LIMIT buttons in the picture).
HERE IS THE MAIN POINT:
When setting filters and/or searching for something, the WHOLE table is taken
into consideration, not only what is currently in the LIMITed dataset. So
my question is: how can we do that with TDataSets in Xailer? (i.e. only
keep a sub-dataset in the dataset, and do all filtering/searching on the
whole *table*)
I'm using MySQL (v5.0x).
(I have tried, for example:
oTSQLQuery:Close()
oTSQLQuery:cSelect := "SELECT * FROM HugeTable LIMIT 0,<new limit>"
oTSQLQuery:Open()
...and it works, but the 'old' dataset is obviously not released, since after
a while the app eats up all the memory on the PC.)
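The EMS-style behaviour described above (a LIMITed window whose filters and searches still see the whole table) can be sketched outside Xailer with Python's built-in sqlite3 standing in for MySQL; the HugeTable name and the 1,000-row window come from this post, everything else is hypothetical:

```python
import sqlite3

# In-memory stand-in for the MySQL table, so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE HugeTable (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO HugeTable (id, name) VALUES (?, ?)",
                 [(i, "row%d" % i) for i in range(1, 5001)])

LIMIT = 1000  # EMS-style default window size

def fetch_window(offset, where="", params=()):
    # The WHERE clause filters the whole table on the server side;
    # only the LIMITed window is ever brought into the dataset.
    sql = "SELECT id, name FROM HugeTable %s ORDER BY id LIMIT ? OFFSET ?" % where
    return conn.execute(sql, (*params, LIMIT, offset)).fetchall()

page1 = fetch_window(0)       # rows 1..1000
page2 = fetch_window(1000)    # SKIP forward: rows 1001..2000
hits = fetch_window(0, "WHERE id > ?", (4500,))  # the filter sees all 5000 rows
```

Each SKIP or filter change issues a fresh SELECT, so the client never holds more than LIMIT rows at a time.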
Thanks a lot if anybody could point me in the right direction.
Paal
Attached files
-
- Posts: 216
- Registered: Sun Sep 23, 2007 11:08 pm
Large SQL datasets - again
The best way would be to simulate how we use .DBFs in Clipper, where it does
not matter how many records/rows there are in a table. We just don't care,
because somehow the "caching" or LIMITing is done silently in the
background.
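That silent background windowing could be sketched, with Python's sqlite3 standing in for the SQL backend and every name hypothetical, as a cursor that lets the caller navigate by record number, DBF-style, while the cursor refetches a LIMITed window only when the record pointer leaves the current one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO big VALUES (?)", [(i,) for i in range(1, 10001)])

class WindowedCursor:
    """DBF-style navigation over an SQL table: the caller just asks for
    record n and the cursor silently refetches a LIMITed window as needed."""
    def __init__(self, conn, window=1000):
        self.conn, self.window = conn, window
        self.base, self.rows = None, []

    def row(self, n):  # n is a 0-based record pointer
        if self.base is None or not (self.base <= n < self.base + self.window):
            # The record pointer left the cached window: fetch the one holding n.
            self.base = (n // self.window) * self.window
            self.rows = self.conn.execute(
                "SELECT id FROM big ORDER BY id LIMIT ? OFFSET ?",
                (self.window, self.base)).fetchall()
        return self.rows[n - self.base]

cur = WindowedCursor(conn)
r0 = cur.row(0)        # fetches the window covering records 0..999
r5000 = cur.row(5000)  # silently refetches the window covering 5000..5999
```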
Does anybody have an idea how to achieve this with SQL in Xailer?
Paal
- ignacio
- Site Admin
- Posts: 9469
- Registered: Mon Apr 06, 2015 8:00 pm
- Location: Madrid, Spain
- Contact:
Large SQL datasets - again
Paal,
> my question is: How can we do that with TDataSet's in Xailer? (i.e. only
> have a sub-dataset in the dataset, and do all filtering/searching on the
> whole *table*)
I do not know how EMS works, but surely when you do any filtering on the
database a completely new SELECT statement is issued. I believe that is the
way to go.
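The difference can be sketched with Python's sqlite3 standing in for MySQL (table and column names hypothetical): filtering the rows already fetched into the dataset only sees the current window, while a completely new SELECT filters the whole table before the LIMIT is applied:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, i % 10) for i in range(1, 3001)])

def reopened(select, params=()):
    # Stand-in for a Close() / new cSelect / Open() cycle on the dataset.
    return conn.execute(select, params).fetchall()

window = reopened("SELECT id, qty FROM items ORDER BY id LIMIT 100")
# Filtering the already-fetched window only looks at 100 rows...
client_side = [row for row in window if row[1] == 7]
# ...while a fresh SELECT filters all 3000 rows before the LIMIT applies.
server_side = reopened(
    "SELECT id, qty FROM items WHERE qty = ? ORDER BY id LIMIT 100", (7,))
```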
Regards,
> ...and it works, but the 'old' dataset is obviously not released as the
> app in a while eats up all memory on the PC.)
Not so obvious to me. In fact, almost all the dataset memory (except the
field definitions) is freed when it is closed.
Regards,
--
Ignacio Ortiz de Zúñiga
http://www.xailer.com
- ignacio
- Site Admin
- Posts: 9469
- Registered: Mon Apr 06, 2015 8:00 pm
- Location: Madrid, Spain
- Contact:
Large SQL datasets - again
Paal,
> The best way would be to simulate how we use .DBFs in Clipper where it does
In my opinion, that is your own fault. You should stop creating Clipper & DBF-like
applications. Consider completely eliminating table browses that
show thousands of records. I suggest you use a QBE approach to limit the
number of rows shown.
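QBE (query by example) here means letting the user fill in example values on a form and generating the WHERE clause from the fields actually filled in, so the browse only ever receives a pre-filtered result set. A minimal sketch in Python, with entirely hypothetical names:

```python
def qbe_where(criteria):
    """Turn {column: value} form input into (where_sql, params).
    Column names come from a fixed form definition, never from free
    user input, so building them into the SQL text is safe here."""
    filled = {col: val for col, val in criteria.items() if val not in ("", None)}
    if not filled:
        return "", ()
    clauses = " AND ".join("%s = ?" % col for col in filled)
    return "WHERE " + clauses, tuple(filled.values())

# Only 'city' and 'status' were filled in, so only they are filtered on.
where, params = qbe_where({"city": "Madrid", "year": None, "status": "open"})
```

The resulting clause and parameter tuple can then be spliced into the LIMITed SELECT that feeds the browse.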
Regards,
--
Ignacio Ortiz de Zúñiga
http://www.xailer.com
-
- Posts: 216
- Registered: Sun Sep 23, 2007 11:08 pm
Large SQL datasets - again
> I suggest you use a QBE approach to limit the number of rows to show.
I have tried, but I'm not happy with that 'indirect' way. I have been looking at
other applications to see how they do it, but I didn't find their way any
better, until I saw EMS.
Microsoft Navision is my model: it handles tables the .DBF way, no matter the
size of the table. Navision used to run on a proprietary database server; now,
after MS bought Navision, it is MS SQL Server, but to my knowledge the table
approach is the same.
Ok, just my thoughts. Thanks for your answer.
Paal
-
- Posts: 216
- Registered: Sun Sep 23, 2007 11:08 pm
Large SQL datasets - again
>> my question is: How can we do that with TDataSet's in Xailer? (i.e. only
>> have a sub-dataset in the dataset, and do all filtering/searching on the
>> whole *table*)
> I do not know how EMS works, but surely when you do any filtering on the
> database a completely new SELECT statement is done. I believe is the way
> to go.
Yes, either a new SELECT statement or a prepared statement in the
oDataSet:cSelect property. That brings up the following questions:
1: USING NEW SELECT STATEMENTS:
Using MySQL, I presume the best dataset to use is TSQLQuery?
This works, but does not release any memory (the 'old' dataset's memory
is not freed), so by definition it does not work:
::oTSQLQueryDataSet:Close()
::oTSQLQueryDataSet:cSelect:= <new SELECT statement>
::oTSQLQueryDataSet:Open()
::oTSQLQueryDataSet:Refresh()
The memory issue is the same as described in another thread I started,
"Memory hogging with TSQLQuery".
So, the question: what is the proper way to issue a new SELECT statement in
a TSQLQuery (or whichever TDataSet you think is best suited)?
2: USING PREPARED SELECT STATEMENTS
Is that possible in the TSQLQuery:cSelect property (or in whichever TDataSet
you think is best suited)?
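I don't know TSQLQuery's internals, but in APIs that support preparation the SQL text is parsed once and only the bound values change between executions; a sketch with Python's sqlite3 standing in for the backend (all names hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, chr(65 + i % 26)) for i in range(1, 1001)])

# One parameterized statement reused for every window: the SQL text stays
# constant and only the bound values change between executions.
SQL = "SELECT id, name FROM t WHERE name = ? ORDER BY id LIMIT ? OFFSET ?"

first = conn.execute(SQL, ("A", 10, 0)).fetchall()   # first 10 'A' rows
nxt = conn.execute(SQL, ("A", 10, 10)).fetchall()    # next 10 'A' rows
```

(sqlite3 caches the prepared form of repeated statements internally; MySQL exposes the same idea as explicit server-side PREPARE/EXECUTE.)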
Thanks,
Paal
-
- Posts: 216
- Registered: Sun Sep 23, 2007 11:08 pm
Large SQL datasets - again
>> ...and it works, but the 'old' dataset is obviously not released as the
>> app in a while eats up all memory on the PC.)
> Not so obvious to me. Indeed, almost all the dataset memory (except fields
> definition) is freed when is closed.
It is not. In addition, no memory is freed when changing the :cSelect
property and doing a :Refresh(). Please see the new thread "Memory
hogging with TSQLQuery".
Thanks,
Paal