Introduction to the Datasets Module

The Dataset Module is designed for data exchange with SQL databases and text files from a diverse set of sources. This module offers compatibility with a wide array of database technologies, including ADO.NET, ODBC, OleDB, and native interfaces, providing straightforward configuration with prominent databases such as SQL Server, Oracle, SQLite, and PostgreSQL.

Features include multi-threaded concurrent connections for efficient data handling, a dedicated SQLite editor for streamlined database management, and an intuitive visual query builder that simplifies the process of query creation and manipulation. Moreover, the Dataset module is equipped to integrate real-time tags within query strings, enhancing the dynamism of data interaction. It can also manage files and recipes in ASCII, Unicode, or XML formats, broadening its applicability across different data types. 



Purpose and Key Concepts

The purpose of the Dataset module in FrameworX is to facilitate efficient data exchange between the platform and various external databases and text files, thereby creating a robust data infrastructure that empowers other real-time modules to function effectively.

Dataset DBs

You need to create a connection for the Datasets Module to communicate with external databases. You can create these connections, called Dataset DBs, by accessing Datasets → DBs in the platform.

Dataset Queries

In the context of the Datasets Module, a Dataset Query refers not only to an SQL query string, but also to a project object that has a logical name, an SQL query related to that logical name, and other parameters defined within the Datasets → Queries section. There are many ways to automatically map the results of a query execution to Tags.

Dataset Tables

A Dataset Table is a logical name that is created within a solution to set up access to a specific table in a connected database. These tables are listed in the Datasets → Tables section of the module. The Tags in the real-time database can easily be mapped to columns in the tables to perform insert, update, or read operations.

Dataset Files

A Dataset File is a logical name that defines parameters for reading and writing files in ASCII, Unicode, or XML formats. To manage Dataset Files, go to Datasets → Files.


Using Tags in the Datasets Module

In the Datasets Module, you can use Tags from your project as dynamic placeholders within SQL queries or file names. This capability facilitates real-time data manipulation within query strings. Moreover, you can map the results of queries to Tags, or employ Tags to populate values when inserting or modifying existing records.

Datasets Module

The Datasets Module enables users to interact with SQL databases seamlessly, allowing them to retrieve and manipulate data within their applications. With this module, users can generate custom visualizations using Displays, implement intricate logic using Scripts, and set up dynamic Alarms based on database information.

Feature Highlights

  • Multi-threaded concurrent connections for efficient data handling.
  • Dedicated SQLite editor for streamlined database management.
  • Intuitive visual query builder that simplifies the process of query creation and manipulation.

Understanding the Datasets module

Overview

The Datasets Module allows easy access to SQL databases, so you can exchange values with tables and run queries using Tags in the application.

Data Utilization

The data retrieved from databases can be utilized in various ways throughout your project:

  • In the Displays Module: Visualization tools like DataGrids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
  • In the Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
  • In the Alarm Module: Data can be used to configure or update alarm conditions, triggers, and actions, enabling the creation of dynamic alarm systems that respond to data changes.

In a similar fashion, various modules can add data to your SQL databases. For instance, the Displays Module can log operator inputs and commands, the Scripting Module can calculate analytics, and the Alarm and Historian Modules can retain long-term records and audit trails. Essentially, the Datasets Module drives bi-directional real-time communication between all Project Modules and the SQL databases.

Processing Data Requests

  • Datasets Data Server Service

The Datasets Data Server Service forms an integral part of the Datasets module, ensuring high performance and seamless integration while exchanging structured collections of data in real-time. It efficiently processes data requests from other modules, enabling connectivity with various components such as HMI/SCADA screens, scripting, and reporting tools.

  • Default Databases Used when Running the Project

The Dataset module also serves as a data storage configuration hub for other modules. The Alarm and Historian modules, which generate events and records for long-term retention, utilize the Dataset module for defining data storage. Similarly, the Security Module can utilize user definitions from an external database defined within the Dataset module.

Data Source Virtualization

Data Source Virtualization is an advanced feature that simplifies data management across multiple data sources. It provides a unified interface for accessing, querying, and manipulating data, regardless of the underlying data storage technology. This feature allows modifications to the underlying data repositories without impacting the rest of the application.

  • Agnostic, Standards, Centralized Management

Adhering to industry standards, the module is designed to work seamlessly with various data storage technologies. It enables centralized management of data connections and configurations, streamlining the process of integrating different data sources into your projects.

For a more detailed explanation of how the Datasets Module works, refer to the Advanced Datasets Topics page.


Configuring the Dataset module

Configuration Workflow

The typical configuration workflow for the Datasets Module is as follows:

Establish Database Connections:

  1. Navigate to Datasets → DBs to create essential database connections.
  2. Gather connection details for project databases.
  3. Leverage the built-in SQLite database for temporary development purposes.

Prepare Queries:

  1. Access Datasets → Queries to craft queries using the Visual Query Editor or provided SQL strings.
  2. Assign logical names (Dataset.Query) to identify and streamline query usage.
  3. Implement a virtualization model for seamless transition to the production database.

Modify Queries for Real-time Interaction:

  1. Fine-tune queries at Datasets → Queries by adding real-time parameters.
  2. Example: Transform "WHERE col1 = 5" to "WHERE col1 = {{tag.Test}}".
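As an illustration of the idea behind such placeholders, the sketch below uses plain Python with SQLite; the `tags` dictionary and the `resolve_query` helper are hypothetical stand-ins for the platform's real-time Tag database, not its actual API. It resolves `{{tag.X}}` markers into bound query parameters rather than splicing raw text into the SQL:

```python
import re
import sqlite3

def resolve_query(query: str, tags: dict):
    """Replace {{tag.Name}} placeholders with '?' parameters and
    collect the corresponding tag values in order."""
    params = []

    def substitute(match):
        params.append(tags[match.group(1)])
        return "?"

    sql = re.sub(r"\{\{tag\.(\w+)\}\}", substitute, query)
    return sql, params

# Demo against an in-memory SQLite table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE your_table (col1 INTEGER, name TEXT)")
conn.executemany("INSERT INTO your_table VALUES (?, ?)",
                 [(5, "a"), (7, "b")])

tags = {"Test": 5}  # hypothetical real-time tag value
sql, params = resolve_query(
    "SELECT * FROM your_table WHERE col1 = {{tag.Test}}", tags)
rows = conn.execute(sql, params).fetchall()
print(rows)  # [(5, 'a')]
```

Binding the placeholder as a parameter (rather than concatenating its text) also protects the query against injection, which matters once tag values can contain arbitrary strings.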

Manage Project Tables:

  1. Navigate to Datasets → Tables for direct management of database tables.
  2. Connect inserted data with project Tags for efficient data handling.
  3. Utilize tables directly when all necessary information resides in a single table.

Configure Stored Procedures:

  1. Extend functionality at Datasets → Queries by configuring and executing Stored Procedures.
  2. Utilize the same interface for defining Stored Procedures as for queries.

Exchange Data with Files:

  1. First, access Datasets → DBs and establish a database connection. After creating a DB, enter the server path in the Connection String property.
  2. Access Datasets → Queries to create queries using the Visual Query Editor or provided SQL strings.

    Tip: Use Meaningful Names

    We recommend assigning logical names, such as Dataset.Query, to identify query usage, improving solution maintenance.


    Info: Enhance Query Flexibility with Real-Time Parameters

    Adding real-time parameters to your queries offers flexibility and adaptability, enabling dynamic data retrieval based on changing conditions. Instead of hardcoding specific values, you can use dynamically populated placeholders at runtime. 

    Instead of using Queries like the following:

    • SELECT * FROM your_table WHERE col1 = 5

    Use Queries with real-time parameters:

    • SELECT * FROM your_table WHERE col1 = {{tag.Test}}

    In this example, {{tag.Test}} serves as a placeholder for a real-time parameter.


  3. Access Datasets → Tables to manage your project tables and connect inserted data with Tags for efficient data handling.
  4. Establish file-based data exchange configurations at Datasets → Files.
  5. Configure interactions with plain text or XML files as needed.

Utilize Logical Objects:

  • Leverage logically named objects (Queries, Tables, Files) throughout the solution.
  • Apply them in script calculations, visualization displays, and other solution components.


    Info

    You can also configure and execute Stored Procedures at Datasets → Queries to extend the Query's functionality.


    Managing DB Connections

    When using SQLite databases, the Datasets Module can automatically create the database locally if it doesn't already exist; for other database types, the database itself must already exist before you set up your connection.

    Info: User Permissions

    Any user can create new connections in the solution, but only the Administrator can configure database login passwords.

    There are four database connections already created in any new solution. The table below lists them.

    Datasets DB - Pre-defined database connections

    DB              Database  Path Location                          Usage
    Retentive       SQLite    <ProjectNameAndPath>.dbRetentive       Stores values for Retentive Tags.
    RuntimeUsers    SQLite    <ProjectNameAndPath>.dbRuntimeUsers    Stores dynamically created Users.
    AlarmHistorian  SQLite    <ProjectNameAndPath>.dbAlarmHistorian  Stores Alarm and AuditTrail records.
    TagHistorian    SQLite    <ProjectNameAndPath>.dbTagHistorian    Stores Tag Historian and Annotations.

    Any of these connections can be customized to use another database type. The optimal storage location depends on several factors, from internal company policies to data volume and how the data will be used; the choice is therefore specific to each solution's requirements. If you need to use another database for the pre-defined connections, execute the steps below:

    1. Rename or delete the previous DB. This step is necessary because the system does not allow the creation of two objects with the same name.
    2. Create a new DB with the same name as the previous DB, with the required database and connection settings.

    Check the Dataset DBs guide for more details.

    Dataset Queries Configuration

    Users can enhance data manipulation by configuring queries with SQL statements in the Datasets Module. The Visual Query Editor simplifies the process, allowing users to drag and drop tables, define relationships, and apply filters through an intuitive graphical interface. Check the Visual SQL Query Builder page for more information on this feature. Created queries can be saved and executed seamlessly within the module.


    Check the Dataset Queries guide for more details.

    Dataset Tables Configuration

    Use the Dataset Tables configuration to manage data tables within the Datasets Module and configure table properties such as structure, relationships, and data types. It provides an intuitive interface to streamline customization and ensure efficient data handling.


    Check the Dataset Tables guide for more information.

    Dataset Files Configuration

    Dataset Files are used to customize file interactions in the Datasets Module. With this feature you can configure file-related properties, define formats, and establish connections. The module provides a user-friendly interface for managing various file types seamlessly. Access the Dataset Files guide for detailed configuration instructions.


    Working with the Dataset module

    Runtime Execution

    One of the key features of the Datasets Module is the ability to execute SQL queries and retrieve data in real time. Here are some ways to leverage the runtime execution features of the Datasets Module:

    • Create SQL queries to retrieve data from external databases.
    • Use query results to trigger events and actions within the platform environment.
    • Configure event triggers based on specific query criteria, such as changes to a particular data point or an exceeded threshold value.


    Intra-Module Interaction

    The Datasets Module integrates easily with other modules within the software environment. Here are some examples of how to use the Datasets Module in conjunction with other modules:


    • Displays Module: Visualization tools like Data Grids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
    • Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
    • Alarm Module: Data can be used to configure or update alarm conditions, triggers, and actions, enabling the creation of dynamic alarm systems that respond to data changes.
    • Unified Namespaces Module: You can utilize Tags from your project as dynamic placeholders within SQL queries or file names. This capability facilitates real-time data manipulation within the query strings. Moreover, you can map the results of the queries to Tags or employ Tags to populate values when inserting or modifying existing records. 

    The Datasets module drives bi-directional real-time communication between all Project Modules and the SQL databases. By leveraging these integration options, users can gain greater insight and control over their data sources within the platform. They can execute SQL queries and trigger actions based on query results, providing robust tools for working with data.


    External System Interaction

    The Datasets Module enables seamless interaction with external databases, enhancing data integration and accessibility. The following is a list of possible databases you can connect to:

    • SQL Server
    • PostgreSQL
    • Excel

    Read more information on the Dataset DBs page and its sub-pages.


    Advanced Datasets Topics

    Databases Connections And Interactions

    The Datasets Module facilitates efficient database interactions by utilizing TServer services. While it doesn't directly interact with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading the Databases Connections And Interactions page.

    Data Management

    The Datasets Module offers versatile methods for managing data and concurrency within solutions, including DataTable tags and Async Contents. Understand how to avoid conflicts and optimize performance by reading the Data Management page.

    Datasets Runtime Attributes

    The Datasets Namespace exposes properties and methods of the .NET objects used by the Datasets Module execution. You can use these properties and methods in your Displays or to create Scripts and Alarms. The Dataset Runtime Attributes page lists all available options.

    Preventing SQL Injections

    When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially when passing parameters as plain text in SQL statements. To mitigate this, the platform uses the .NET API, where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:

    execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE

    This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
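The same principle can be demonstrated outside the platform. The sketch below uses plain Python with SQLite; the table and values are made up for illustration. It contrasts splicing a value into the SQL text with binding it as a parameter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "x' OR '1'='1"

# Unsafe: the input is spliced into the SQL text and alters the command,
# so the always-true OR clause returns every row.
unsafe_sql = f"SELECT * FROM users WHERE name = '{malicious}'"
print(len(conn.execute(unsafe_sql).fetchall()))  # 1 -- injection succeeded

# Safe: the value is bound as a parameter and never parsed as SQL.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)).fetchall()
print(len(safe_rows))  # 0 -- treated strictly as a value
```

The parameterized form matches the spirit of the Stored Procedure syntax above: each value travels to the database as data, not as part of the command text.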

    Network Gateway Access And Time Zone Handling

    In some scenarios, specific databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided you install our platform on that machine. Only TWebServer would be active in such cases, handling database interaction, receiving commands, and establishing connections. This method offers a workaround for database access restricted to specific machines.

    The platform treats all Date and Time type Tags as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If you need to read and write dates in the local timezone, account for the time difference when preparing the SQL statement or when using the WhereCondition property. Other external data tables, from third parties or the client, can use any timezone. To assist in handling these data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading from and writing to Tags.
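As a minimal sketch of the timezone adjustment described above (plain Python; the UTC-3 offset and the `alarms` table are hypothetical), converting a local calendar day into UTC bounds for a WHERE clause might look like:

```python
from datetime import datetime, timedelta, timezone

# Assume the database stores timestamps in UTC, as the platform does.
local_tz = timezone(timedelta(hours=-3))  # hypothetical local zone

def local_day_bounds_utc(year, month, day):
    """Return the UTC start/end of a local calendar day, for use in a
    WHERE clause against UTC-stored timestamps."""
    start_local = datetime(year, month, day, tzinfo=local_tz)
    end_local = start_local + timedelta(days=1)
    return (start_local.astimezone(timezone.utc),
            end_local.astimezone(timezone.utc))

start_utc, end_utc = local_day_bounds_utc(2024, 1, 15)

# The bounds would then feed a parameterized statement such as:
sql = "SELECT * FROM alarms WHERE ts >= ? AND ts < ?"
print(start_utc.isoformat())  # 2024-01-15T03:00:00+00:00
```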

    Backup of Solution SQLite Databases

    There are several ways to back up SQLite databases in a solution. One simple way is to make a copy of the database file itself, but this method may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your solution:

    • SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:

      sqlite3 source_database.sqlite ".backup backup_file.sqlite"

      This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.

    • SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API provides several benefits over the backup command, such as the ability to perform incremental backups and monitor the progress of the backup process. Read more on the SQLite Backup API documentation.

    Regardless of your chosen method, it's crucial to store your backup files in a secure location and test them regularly to ensure that you can use them to restore your database if needed.
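For instance, the Online Backup API is exposed in Python's standard sqlite3 module (Python 3.7+), so a live backup can be sketched as follows (in-memory databases are used here purely for illustration; a real backup would target a file path):

```python
import sqlite3

# Source database (in-memory here; normally the file your application
# is actively using).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE history (ts TEXT, value REAL)")
src.execute("INSERT INTO history VALUES ('2024-01-01T00:00:00Z', 42.0)")
src.commit()

# Destination connection; in practice this would be something like
# sqlite3.connect("backup_file.sqlite").
dst = sqlite3.connect(":memory:")

# Connection.backup() wraps SQLite's Online Backup API, so the copy is
# safe even while the source is in use; pages=100 copies the database
# in chunks of 100 pages, allowing incremental progress.
src.backup(dst, pages=100)

count = dst.execute("SELECT COUNT(*) FROM history").fetchone()[0]
print(count)  # 1
```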


    Troubleshooting and Best Practices


    Common Issues and Solutions

    Issue: Connection loss between project and database

    Database Timeout Configuration: The database may have a timeout setting that automatically disconnects idle connections after a certain period of time. It's recommended to check the database's timeout setting and adjust it, if necessary, to ensure that the connection remains active overnight.

    Power Settings: It's also suggested to check the computer's power settings to ensure that it's not entering sleep or hibernation mode during a certain idle moment, which could cause loss of connection to the database. Adjusting these settings to keep the computer active during these idle moments may resolve the issue.

    Issue: Database Connection Problem

    In the DB configuration, there is always a "Test" button to verify that the connection is working correctly. When there is a problem, this button returns an error message, usually produced by the database provider itself. The most common errors are: invalid user, invalid password, computer without access to the database, and incorrect authentication method.

    Issue: Error accessing the Database Table

    Once the connection is established, the Table configuration is specific to a table. In the "Table" combobox, the list of available tables automatically appears. It is possible, via script, to change which table will be accessed. However, care must be taken that the table exists and that the configuration is done using the correct name. The same care must be taken when Queries are used, as it is the user's responsibility to type the correct table name, as well as the syntax of the separators.

    Error in the syntax of the Query

    It is the user's responsibility to type the correct SQL statement of a query using the QueryBuilder. Table names, columns, and values can all generate an error if used incorrectly. For example, comparing different types may not return the expected result, and strings should generally be in single quotes. The separators and clauses available can vary between databases, for example:

    SQL Server:

    SELECT TOP 10 * FROM table WHERE column = value

    SQLite:

    SELECT * FROM table WHERE column = value LIMIT 10;

    Oracle:

    SELECT * FROM table WHERE column = value AND ROWNUM <= 10;

    or, in newer Oracle versions:

    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

    IBM DB2:

    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;


    ServerIP without TWebServer running on the remote machine

    In some cases, the computer may not have access to the database. In this case, it is possible to create a gateway, routing the commands to be executed on a computer that does have access to the database. The ServerIP field should be configured with the IP and port (<IP>:<port>) of this computer with access permission. That computer must have the software installed and the TWebServer running; it will automatically perform the gateway service and forward the commands to the database.

    DataTable returned NULL

    When the return of a select or query is null, an error has occurred. Common errors include: connection failure with the database, table not found, Dataset module not running, or incorrect query syntax. Check the return of the method using WithStatus when using a synchronous method, or use the LastStatus and LastStatusMessage properties when using asynchronous mode.

    DataTable returned with 0 rows

    When this happens, in general, the connection with the database is working and the table name is correct. In this case, the search condition is usually wrong, or the table is genuinely empty. Check that the column names are correct and that the separators and clauses are valid.

    Dataset module is down

    Although the TServer is responsible for forwarding requests to the database, the management of and communication with the TServer is done by the Dataset module, as well as the handling of responses. Therefore, if you are having basic problems with database access and execution, the first thing to check is whether the module is set up to run and is actually running.

    Very high response time

    Sometimes it may seem that access to the database is not being made, when what is actually happening is that some accesses return a very large amount of data, or the database is overloaded or badly configured, causing low performance. The network itself can also be overloaded and slow, and all these factors can impact the response time. In these cases, it is important to execute the query directly in the database environment to be sure that the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to get an idea of the related side effects.

    Update of a table with the wrong schema (select before update)

    The Dataset module uses ADO technology, and many things are resolved at the level of this API. When performing an Update on a table, the schema of the table and controls in the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or .NET DataTable object as a parameter, it is important that this object respects the schema of the destination table in the database. Normally, a Select command must have been issued at some point to obtain the correct schema used by the database. After this, it is easy to add, remove, and modify values in this DataTable and update it back to the physical table in the database.
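A minimal sketch of this select-before-update pattern, using plain Python with SQLite rather than the platform's .NET DataTable (the `recipe` table and its columns are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipe (id INTEGER PRIMARY KEY, setpoint REAL)")
conn.execute("INSERT INTO recipe VALUES (1, 10.0)")

# 1. Select first, so the in-memory copy carries the table's schema
#    (column names and order) exactly as the database defines it.
cur = conn.execute("SELECT * FROM recipe")
columns = [d[0] for d in cur.description]   # ['id', 'setpoint']
rows = [dict(zip(columns, r)) for r in cur.fetchall()]

# 2. Modify values in the in-memory copy.
rows[0]["setpoint"] = 12.5

# 3. Write back using the same schema, column by column.
for row in rows:
    conn.execute("UPDATE recipe SET setpoint = ? WHERE id = ?",
                 (row["setpoint"], row["id"]))
conn.commit()
print(conn.execute("SELECT setpoint FROM recipe WHERE id = 1").fetchone())
# (12.5,)
```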

    WHERE condition case sensitivity

    Case sensitivity in a WHERE clause depends on the database and the configuration of the database you are using. For example, in MySQL, queries are case-insensitive by default, which means 'abc' and 'ABC' would be considered equal. However, this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, queries are case-sensitive by default, so 'abc' and 'ABC' would be considered different. Therefore, it really depends on the specific database and the settings of that database. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert all values to upper or lower case before comparison. For example:

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE LOWER(column) = LOWER(value);

    This query will return records where the column matches the value, regardless of capitalization.
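The behavior is easy to verify in SQLite, where plain `=` on text is case-sensitive by default; a minimal sketch in Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (code TEXT)")
conn.execute("INSERT INTO items VALUES ('ABC')")

# Plain equality: case-sensitive in SQLite by default -> no match.
exact = conn.execute(
    "SELECT * FROM items WHERE code = ?", ("abc",)).fetchall()

# LOWER() on both sides: case-insensitive -> matches.
folded = conn.execute(
    "SELECT * FROM items WHERE LOWER(code) = LOWER(?)", ("abc",)).fetchall()

print(len(exact), len(folded))  # 0 1
```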

    Performance

    The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.

    Best Practices and Recommendations

    Error Handling

    Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the error message updates the module's Error property (LastStatus). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the process returns an empty DataTable and updates the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status is an output parameter of the method.

    These are some of the key points to keep in mind while using the Dataset module. It is a powerful tool that provides much flexibility, but it also requires careful use and an understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions, and seek expert advice if you need clarification on anything.

    Troubleshooting and Best Practices

    Table of Contents
    maxLevel4
    minLevel3
    include#

    Common #Issues and Solutions

    #Issue: Connection loss between project and database

    Database Timeout Configuration: The database may have a timeout setting that automatically disconnects idle connections after a certain period of time. It's recommended to check the database's timeout setting and adjust it, if necessary, to ensure that the connection remains active overnight.

    Power Settings: It's also suggested to check the computer's power settings to ensure that it's not entering sleep or hibernation mode during a certain idle moment, which could cause loss of connection to the database. Adjusting these settings to keep the computer active during these idle moments may resolve the issue.

    #Issue: Database Connection Problem

    In the DB configuration, there is always a "Test" button to ensure that the connection is happening correctly. When there is a problem, the return of this button is an error message, usually returned by the database provider itself. The most common errors are: invalid user, invalid password, computer without access to the database, incorrect form of authentication.

    #Issue: Error accessing the Database Table

    Once the connection is established, the Table configuration targets a specific table. The "Table" combobox automatically lists the available tables. It is possible, via script, to change which table is accessed; however, make sure the table exists and that the configuration uses the correct name. The same care applies when Queries are used, as it is the user's responsibility to type the correct table name and the syntax of the separators.
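    When the table name is supplied at runtime via script, a defensive check against the database catalog avoids the "table not found" error. The sketch below uses Python/sqlite3 and SQLite's sqlite_master catalog as an assumed example; other databases expose the same information through INFORMATION_SCHEMA.TABLES.

    Code Block
    languagepython
    titleTable existence check (illustrative sketch)

```python
import sqlite3

def table_exists(conn, name):
    """Check the catalog before pointing a query at a runtime-supplied
    table name (SQLite's sqlite_master here; other databases use
    INFORMATION_SCHEMA.TABLES)."""
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?",
        (name,),
    ).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, value REAL)")
print(table_exists(conn, "readings"))  # True
print(table_exists(conn, "history"))   # False -- guard before querying
```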

    #Error in the syntax of the Query

    It is the user's responsibility to ensure the SQLStatement of a query is correct, even when starting from the QueryBuilder. Table names, columns, and values can all generate errors if used incorrectly. For example, comparing different types may not return the expected result, and strings should generally be enclosed in single quotes. The available separators and clauses also vary between databases, for example:

    SQLServer

    Code Block
    languagesql
    titleQuery syntax
    SELECT TOP 10 * FROM table WHERE column = value

    SQLite

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value LIMIT 10;

    Oracle

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value AND ROWNUM <= 10;

    or, in newer Oracle versions (12c and later):

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

    IBM DB2

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;

    #ServerIP without TWebServer running on the remote machine

    In some cases, the computer running the project may not have direct access to the database. It is then possible to create a gateway, routing the commands to be executed on a computer that does have access. Configure the ServerIP field with the IP and port (<IP>:<port>) of that computer, which must have the software installed with the TWebServer running. It will automatically perform the gateway service and forward the commands to the database.

    #DataTable returned NULL

    When a select or query returns null, an error has occurred. Common causes include: connection failure with the database, table not found, Dataset module not running, or incorrect query syntax. Check the return of the method using the WithStatus variant when calling a synchronous method, or use the LastStatus and LastStatusMessage properties when using asynchronous mode.

    #DataTable returned with 0 rows

    When this happens, the connection with the database generally works and the table name is correct; the search condition is usually wrong, or the table is genuinely empty. Check that the column names are correct and that the separators and clauses are valid.
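    The distinction between the two cases above — a null result (an error) versus an empty result (an unmatched condition) — can be sketched in Python/sqlite3. This is an illustrative analog, not the platform API; the helper name run is an assumption for the example.

    Code Block
    languagepython
    titleNull result vs. zero rows (illustrative sketch)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.execute("INSERT INTO t VALUES ('abc')")

def run(sql):
    """Return None on error (the 'NULL DataTable' case) or a row list,
    which may legitimately be empty (the '0 rows' case)."""
    try:
        return conn.execute(sql).fetchall()
    except sqlite3.Error:
        return None

print(run("SELECT * FROM nope"))                 # None -> an error occurred
print(run("SELECT * FROM t WHERE name = 'x'"))   # []   -> condition matched nothing
print(run("SELECT * FROM t WHERE name = 'abc'")) # [('abc',)]
```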

    #Dataset module is down

    Although the TServer is responsible for forwarding requests to the database, the management of and communication with the TServer, as well as the handling of responses, is done by the Dataset module. Therefore, if you are having basic problems with database access and execution, the first thing to check is whether the module is set up to run and is actually running.

    #Very high response time

    Sometimes it may seem that the database is not being accessed at all, when in fact a query is returning a very large amount of data, the database is overloaded or badly configured and performing poorly, or the network itself is overloaded and slow; all of these factors impact response time. In these cases, it is important to execute the query directly in the database environment to be sure the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to gauge the related side effects.
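    When profiling, it helps to measure both the elapsed time and the row volume of the suspect query. A minimal Python/sqlite3 sketch of that measurement, under the assumption of a synthetic 100,000-row table:

    Code Block
    languagepython
    titleMeasuring query time and data volume (illustrative sketch)

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (v REAL)")
conn.executemany("INSERT INTO samples VALUES (?)",
                 [(float(i),) for i in range(100_000)])

start = time.perf_counter()
rows = conn.execute("SELECT * FROM samples").fetchall()
elapsed = time.perf_counter() - start

# Compare this number with the time the same query takes when run
# directly in the database's own tool; also note the row volume.
print(f"{len(rows)} rows in {elapsed:.3f} s")
```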

    #Update of a table with the wrong schema (select before update)

    The Dataset module uses ADO.NET technology, and many things are resolved at the level of that API. When performing an Update on a table, the table's schema and the controls of the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or a .NET DataTable object as a parameter, that object must respect the schema of the destination table in the database. Normally, a Select command must have been executed at some point to obtain the correct schema used by the database. After that, it is easy to add, remove, and modify values in the DataTable and update it back to the physical table in the database.
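    The platform does this through the .NET DataTable and ADO.NET machinery; the following is a rough Python/sqlite3 analog of the select-modify-update flow, with the table name recipe and its columns invented for the example. The point is the order of operations: learn the schema from a select first, then write the modified rows back using that schema.

    Code Block
    languagepython
    titleSelect before update (illustrative sketch)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipe (id INTEGER PRIMARY KEY, setpoint REAL)")
conn.execute("INSERT INTO recipe VALUES (1, 10.0), (2, 20.0)")

# 1. SELECT first, so the in-memory copy carries the table's real schema.
cur = conn.execute("SELECT id, setpoint FROM recipe")
columns = [d[0] for d in cur.description]  # schema learned from the select
rows = [dict(zip(columns, r)) for r in cur.fetchall()]

# 2. Modify the in-memory rows.
for row in rows:
    row["setpoint"] *= 2

# 3. Write the rows back, column names taken from the learned schema.
conn.executemany("UPDATE recipe SET setpoint = :setpoint WHERE id = :id", rows)
conn.commit()
print(conn.execute("SELECT setpoint FROM recipe ORDER BY id").fetchall())
# [(20.0,), (40.0,)]
```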

    #Where condition CaseSensitive

    Case sensitivity in a WHERE clause depends on the database and the configuration of the database you are using. For example, in MySQL, queries are case-insensitive by default, which means 'abc' and 'ABC' would be considered equal. However, this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, queries are case-sensitive by default, so 'abc' and 'ABC' would be considered different. Therefore, it really depends on the specific database and the settings of that database. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert all values to upper or lower case before comparison. For example:

    Code Block
    languagesql
    titleQuery syntax
    SELECT * FROM table WHERE LOWER(column) = LOWER(value);

    This query will return records where the column matches the value, regardless of capitalization.
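    The effect is easy to demonstrate in SQLite (whose default TEXT comparison is case-sensitive); this Python/sqlite3 sketch uses an invented table t with three rows. A plain equality matches only the exact-case row, while the LOWER() form matches both spellings.

    Code Block
    languagepython
    titleCase-insensitive comparison with LOWER() (illustrative sketch)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("abc",), ("ABC",), ("xyz",)])

# Exact comparison is case-sensitive here: only one row matches.
print(conn.execute(
    "SELECT COUNT(*) FROM t WHERE name = 'ABC'").fetchone()[0])              # 1

# Wrapping both sides in LOWER() makes the match case-insensitive.
print(conn.execute(
    "SELECT COUNT(*) FROM t WHERE LOWER(name) = LOWER('ABC')").fetchone()[0])  # 2
```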

    Best Practices and #Recommendations

    #recommendation: Error accessing the Database Table

    Dataset Runtime Attributes

    The Dataset namespace exposes properties and methods of the .NET objects used by the Dataset module execution.

    For more information on namespaces and objects, go to Objects and Attributes.

    Dataset Module Properties Examples

    Property: CursorIndex
    Type: Integer
    Description: Represents the current position of a cursor within the result set of a query. It is an integer value indicating the row number the cursor points to, with the first row having an index of 0. The cursor can be used to navigate through the result set, allowing you to read, update, or delete rows of data.
    Example: Dataset.Query.QueryName.CursorIndex + 1

    Property: OpenStatusMessage
    Type: String
    Description: Provides a human-readable message describing the current state of the database connection. This can be useful for monitoring the connection status, diagnosing issues, or displaying the connection status to users in a user-friendly format.
    Example: Dataset.DB.DatabaseName.OpenStatusMessage

    Property: ConnectionString
    Type: String
    Description: Stores the information needed to establish a connection to a database, such as server name, database name, authentication credentials, and other relevant parameters.
    Example: Dataset.DB.ProviderName.ConnectionString
    This section describes only some commonly used properties. For the full list of properties and methods, go to the Namespaces Reference.



    In this section:

    Page Tree
    root@self
    spacesV10

    ...