<style>
.text-span-6 {
background-image: linear-gradient(99deg, rgba(170, 163, 239, .5), rgba(125, 203, 207, .5));
border-radius: 50px;
padding-left: 15px;
padding-right: 15px;
}
#title-text {
display: none;
}
.panelgradient {
background-image: linear-gradient(180deg, #d5def0, whitesmoke);
border-radius: 8px;
flex-direction: column;
justify-content: center;
align-items: center;
padding: 4rem;
display: flex;
position: relative;
}
</style>
<div class="panelgradient">
<h1 style="text-align: center;">Datasets <br> (Databases and SQL Queries)</h1>
</div>
Introduction to the Datasets Module
The Datasets Module is designed for data exchange with SQL databases and text files from a diverse set of sources. Essentially, the Datasets Module drives bi-directional real-time communication between all modules and the SQL databases.
This module offers compatibility with various database technologies, including ADO.NET, ODBC, OleDB, and native interfaces, providing straightforward configuration with prominent databases such as SQL Server, Oracle, SQLite, and PostgreSQL. Features include:
- Multi-threaded concurrent connections with multiple databases for efficient data handling
- SQL Query editor, SQLite admin tool, and a Visual Query Builder, streamlining the configuration experience
- Customization of SQL statements in real time with tags and system events
- Management of files and recipes in ASCII, Unicode, or XML formats
Purpose and Key Concepts
Dataset DBs
In order for the Dataset Module to communicate with an external database, a connection must be created with certain parameters. These connections, which are created within the Datasets → DBs section of the module, are referred to as Dataset DBs.
Dataset Queries
In the context of the Dataset Module, a Dataset Query refers not only to an SQL query string, but also to a Project object with a logical name, the SQL query associated with that name, and other parameters defined within the Datasets → Queries section. There are many ways to automatically map the results of a query execution to Tags.
Dataset Tables
A Dataset Table is a logical name that is created within a project to set up access to a specific table in a connected database. These tables are listed within the Datasets → Tables section of the module. The Tags in the real-time database can easily be mapped to columns in the tables to perform insert, update, or read operations.
Dataset Files
A Dataset File is a logical name that defines parameters for reading and writing files in ASCII, Unicode, or XML formats.
Understanding the Datasets Module
Feature Highlights
SQL Query Support: The Dataset Module provides support for SQL queries, allowing users to easily extract, manipulate, and transform data from a variety of sources.
Integration with External Data Sources: The Dataset Module can integrate with a wide range of external data sources, including databases, CSV files, and other external sources, providing a flexible and powerful tool for data collection and analysis.
Access Types: Access Types allow users to group and organize data points based on their usage and permissions, providing a powerful tool for managing and controlling access to data within the Dataset Module.
Visual Query Editor: The Visual Query Editor provides a user-friendly interface for creating and editing SQL queries, making it easy for users to define complex queries without needing extensive SQL knowledge.
Customizable Dashboards: The Dataset Module provides the ability to create custom dashboards and visualizations based on the data collected by the module, allowing users to easily view and analyze data in real-time.
Stored Procedures Execution: The Dataset Module can execute Stored Procedures and return the results to the platform, allowing users to perform advanced data manipulation and analysis within the context of the platform.
Real-Time Execution: The Dataset Module supports real-time execution of SQL queries, allowing users to monitor and analyze data as it is generated in real-time.
Datasets Data Server Service
The Datasets Data Server Service is an essential part of the Datasets Module. It is responsible for managing and providing efficient access to datasets, which are structured collections of data used in various project applications such as data analysis, reporting, and visualization. The service ensures high performance and seamless integration with other components, offering flexibility and ease of use when working with datasets.
Processing Requests from Other Modules
The Processing Requests from Other Modules feature in FactoryStudio enables smooth communication between different project modules. It handles data requests and interactions, ensuring efficient data exchange and coordination among various components such as HMI/SCADA screens, scripting, and reporting tools.
Databases Used when Running the Project
The platform has pre-configured databases that store essential project information, including real-time and historical data, alarms, events, and system configurations. These databases provide a reliable and efficient foundation for data storage and retrieval, allowing users to focus on building and customizing their projects without worrying about database setup and management.
Data Source Virtualization Benefits
Data Source Virtualization is an advanced feature that simplifies data management across multiple data sources. It offers a unified interface for accessing, querying, and manipulating data, regardless of the underlying data storage technology, ensuring flexibility and ease of use.
Agnostics, Standards, Centralized Management
FactoryStudio's Data Source Virtualization is designed to be agnostic and adhere to industry standards, allowing it to work seamlessly with various data storage technologies, such as SQL databases, OPC UA servers, or custom data sources. This approach enables centralized management of data connections and configurations, streamlining the process of integrating different data sources into your project.
Key Concepts and Terms
DatasetDB
Connection settings created by the Dataset Module to communicate with an external database.
DatasetQuery
Logical name associated with the configuration for SQL query statements against a database, with properties and methods for running queries.
DatasetTable
Logical name created to hold configuration settings to access specific tables in a connected database, mapping tags to table columns for operations.
DatasetFile
Logical name defining parameters for reading and writing files in ASCII, Unicode, or XML formats.
Understanding the Datasets Module
The Datasets Module enables users to interact with SQL databases seamlessly. The module supports real-time Tags within SQL statements, and manages files and recipes in ASCII, Unicode, or XML formats.
The data retrieved from databases can be utilized in various ways throughout your solution. For example:
- In the Displays Module: Visualization tools like DataGrids can present query results on screens and dashboards, creating custom views of the data that are accessible and easy to understand for your users.
- In the Scripting Module: Custom scripts can reference query results and trigger specific actions, such as sending notifications, updating tags, or performing calculations, thereby implementing complex logic based on database data.
- Devices: Sending data from field equipment to a SQL database, or applying settings from the database to the field equipment.
Pre-defined Database Connections
The Dataset Module also serves as a data storage configuration hub for other modules. The following Database connections are pre-defined by the Dataset Module.
- AlarmHistorian: Events and records for long-term retention.
- TagHistorian: Time-series storage for process variables.
- RuntimeUsers: Dynamic users and credentials created when running the solution.
- Retentive: Persistent records for tags and properties that need to be kept across multiple starts of the solution (typically configuration settings and setpoints).
Processing Data Requests
The Datasets Module has its implementation running as a service, which ensures high performance and real-time responses to multiple client requests.
This architecture also enhances protection and security for the database, as client displays and scripts won't access the databases directly, but through the Datasets Service.
Another benefit is the ability for Data Source Virtualization: when the solution uses Dataset.Query.Query1 in its displays or scripts, the database running that query, along with the query itself, can be maintained or replaced without affecting the overall solution configuration. This feature allows the solution to work with the data regardless of the underlying data storage technology.
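The virtualization idea can be sketched generically: callers reference a logical query name, and a registry resolves that name to a concrete connection and SQL statement. The sketch below uses Python with sqlite3 as a stand-in database; the registry and `run_query` names are hypothetical illustrations, not platform APIs.

```python
import sqlite3

# Hypothetical registry: a logical query name maps to a
# (connection factory, SQL statement) pair. Callers only know the name.
registry = {
    "Query1": (lambda: sqlite3.connect(":memory:"), "SELECT 1 AS one"),
}

def run_query(name):
    """Resolve a logical query name and execute it."""
    connect, sql = registry[name]
    conn = connect()
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()

print(run_query("Query1"))  # [(1,)]
```

Swapping the database or the SQL behind "Query1" only changes the registry entry; no caller has to be modified, which is the essence of the virtualization described above.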
For a deeper understanding of the Datasets Services, see Dataset Advanced Topics.
Configuring the Datasets Module
Configuration Workflow
The typical configuration workflow for the Dataset module has the following sequence:
Datasets Module Configuration Workflow

Action | Where | Comments
---|---|---
Define database connections | Datasets → DBs | Gather connection details for your application's databases and create DB objects as needed. Leverage the built-in SQLite admin tool for temporary development purposes.
Prepare queries | Datasets → Queries, DataExplorer → SQL, VisualQueryBuilder | Craft queries using the built-in SQL Language Editor, the VisualQueryBuilder, or SQL statements provided from other sources. Fine-tune queries by adding real-time parameters, e.g. transform "WHERE col1 = 5" into "WHERE col1 = {{tag.Test}}".
Map database tables | Datasets → Tables | Optionally, establish a direct mapping to tables within the database.
Map recipes and text files | Datasets → Files | Optionally, your solution may need to save or load recipes, or other information, from ASCII, Unicode, or XML files.
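The tag-placeholder step in "Prepare queries" can be illustrated with a minimal sketch. The `{{tag.Name}}` syntax is the platform's; the `render_query` helper below is a hypothetical Python stand-in that only mimics the substitution, not the platform's actual engine.

```python
import re

def render_query(template, tags):
    """Replace {{tag.Name}} placeholders with current tag values.
    Hypothetical illustration of the substitution step."""
    def sub(match):
        return str(tags[match.group(1)])
    return re.sub(r"\{\{(tag\.\w+)\}\}", sub, template)

tags = {"tag.Test": 5}
print(render_query("SELECT * FROM t WHERE col1 = {{tag.Test}}", tags))
# SELECT * FROM t WHERE col1 = 5
```

For user-supplied values, prefer the parameter-binding approach discussed under "Preventing SQL Injections"; placeholder substitution is best reserved for trusted tag values.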
Managing DB Connections
There are four database connections pre-defined in any new solution.
Datasets DB - Pre-defined database connections

DB | Database | Path Location | Usage
---|---|---|---
Retentive | SQLite | <ProjectNameAndPath>.dbRetentive | Stores values for the Tags with the Retentive property set.
RuntimeUsers | SQLite | <ProjectNameAndPath>.dbRuntimeUsers | Stores dynamically created Solution SecurityUsers.
AlarmHistorian | SQLite | <ProjectNameAndPath>.dbAlarmHistorian | Stores Alarm and AuditTrail records.
TagHistorian | SQLite | <ProjectNameAndPath>.dbTagHistorian | Stores Tag Historian and Annotations.
When using SQLite databases, the Dataset Module can automatically create the database locally if it doesn't already exist. For other database types, the database itself must already exist before you set your connection.
→ Read more about Datasets DBs.
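The automatic-creation behavior described for SQLite can be reproduced outside the platform, since SQLite itself creates the database file on first connection. A minimal Python sketch (the file name is illustrative):

```python
import os
import sqlite3
import tempfile

# sqlite3.connect() creates the database file if it does not exist,
# which is what allows the module to auto-create SQLite databases.
path = os.path.join(tempfile.mkdtemp(), "demo.dbRetentive")
assert not os.path.exists(path)

conn = sqlite3.connect(path)  # the file is created here
conn.execute("CREATE TABLE IF NOT EXISTS retentive (name TEXT, value REAL)")
conn.commit()
conn.close()

print(os.path.exists(path))  # True
```

Other database servers (SQL Server, Oracle, PostgreSQL) have no such file-level shortcut, which is why those databases must exist before the connection is configured.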
DatasetQueries Configuration
Use DatasetQueries to define SQL statements, for queries and stored procedures, to execute against the configured DatasetDB connections.
→ Read more about Datasets Queries.
DatasetTables Configuration
Use DatasetTables to access or exchange data with database tables using a simplified query syntax. It also allows inserting new rows directly into database tables.
→ Read more about Datasets Tables.
DatasetFiles Configuration
DatasetFiles are used to customize file interactions in the Dataset Module. With this feature you can read or write real-time tags to ASCII, Unicode, and XML files.
→ Read more about Datasets Files.
Working with the Datasets Module
Runtime Execution
When executing the solution, there is an infrastructure of services that manages access to the database and transports that information to where it is requested. For instance, to display a Dataset Query result on an HTML5 page, that request first goes to the server, which then requests the database (which can be on another computer), and the information flows back to the requester.
As database operations can take some time to execute, it is very important to understand some aspects of the Datasets Module execution, including the concept of synchronous vs. asynchronous requests.
The page Datasets Module Execution details concepts that describe the module's internal operations.
Showing DataGrids Tables on Displays
One typical use of the Dataset Module is to display query results on displays and dashboards.
In order to do so, create the DatasetQuery, or DatasetTable, then use the DataGrid Control on your displays.
Using Query Results on Scripts and Tags
It's possible to define the SQL statements with code (either using the Scripts Module or Display CodeBehind) and connect the results with tags.
The property Dataset.Query.QueryName.SqlStatement holds the query that will be executed; just modify that property within your scripts.
The Tag Type DATATABLE was created to be compatible with the results of Select() statements. Simply apply the results of your query and use tags to manage the information.
The TK (Toolkit extension for Scripts) has methods that allow easy copying between DataTables (query results) and Template Tags, like TK.CopyTagToDataTable().
Monitoring Databases Connection Status
Monitoring Database Connections is an essential aspect of maintaining a reliable operation of the solution.
This can be accomplished using the Dataset Namespace properties, which provide status for DatasetTables and DatasetQueries operations.
→ Read more about Datasets Runtime Attributes.
During the development phase, when the Designer tool is connected to a Runtime (the solution is in execution), the main status conditions can be seen on the monitoring page.
→ Read more about Datasets Monitor.
Datasets Advanced Topics
Datasets Module Execution
The Dataset module facilitates efficient database interactions by utilizing TServer services, managing synchronous and asynchronous executions for optimal performance.
→ Read more about Datasets Module Execution.
Data Management
The Dataset Module offers versatile methods for managing data and concurrency within solutions, including Data Table tags and Async Contents.
→ Read more at Data Management.
Datasets Runtime Attributes
The Datasets Namespace exposes properties and methods from the .NET objects used by the Datasets Module execution. You can use these properties and methods on your Displays, or to create Scripts and Alarms.
→ Read more at Datasets Runtime Attributes.
Preventing SQL Injections
Prevent SQL injections by using parameterized queries or stored procedures with proper parameter binding. Avoid concatenating user inputs directly into SQL statements. In .NET, use APIs to add parameters to commands. This ensures inputs are treated as values, not executable code.
→ Read more at Datasets Advanced Topics.
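The difference between concatenation and parameter binding can be shown with a minimal sketch, here using Python's sqlite3 module as a stand-in database (table and data are illustrative; the same principle applies to the .NET parameter APIs mentioned above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

hostile = "nobody' OR '1'='1"  # crafted input

# Unsafe: string concatenation lets the input rewrite the WHERE clause.
unsafe_rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + hostile + "'").fetchall()
print(len(unsafe_rows))  # 1 -- every row leaks although 'nobody' does not exist

# Safe: parameter binding treats the input strictly as a value, never as SQL.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (hostile,)).fetchall()
print(len(safe_rows))  # 0
```

The placeholder style varies by provider (`?` for SQLite/ODBC, `@name` for SQL Server, `:name` for Oracle), but the binding principle is identical.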
Network Gateway Access And Time Zone Handling
To access databases restricted by network or security constraints, use the ServerIP column to redirect commands through a machine that has permission and is running the TWebServer. For time zone handling, the platform uses UTC for date and time tags; to manage conversions between UTC and local time, use the DateTimeMode column.
→ Read more at Datasets Advanced Topics.
Backup of Solution SQLite Databases
To back up SQLite databases, use the sqlite3 command-line tool to create a copy with .backup, or use the Online Backup API for incremental and active-use backups. Ensure secure storage and regular testing of backup files for reliability.
→ Read more at Datasets Advanced Topics.
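For scripted backups, Python's sqlite3 module exposes the same Online Backup API via Connection.backup(). A minimal sketch (in-memory databases are used for brevity; a real backup target would be a file path):

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE history (ts TEXT, value REAL)")
src.execute("INSERT INTO history VALUES ('2024-01-01 00:00:00', 42.0)")
src.commit()

dst = sqlite3.connect(":memory:")  # in production: sqlite3.connect("backup.db")
src.backup(dst)                    # wraps SQLite's Online Backup API
print(dst.execute("SELECT COUNT(*) FROM history").fetchone()[0])  # 1
```

Unlike copying the database file directly, the backup API is safe to run while the database is in active use, which matters for solutions that write historian data continuously.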
Common Issues and Solutions
Connection loss between project and database
Database Timeout Configuration: The database may have a timeout setting that automatically disconnects idle connections after a certain period. It's recommended to check the database's timeout setting and adjust it, if necessary, to ensure that the connection remains active overnight.
Power Settings: It's also suggested to check the computer's power settings to ensure that it doesn't enter sleep or hibernation mode during idle moments, which could cause a loss of connection to the database. Adjusting these settings to keep the computer active during these idle moments may resolve the issue.
Database Connection Problem
In the DB configuration, there is always a "Test" button to verify that the connection works correctly. When there is a problem, the button returns an error message, usually issued by the database provider itself. The most common errors are: invalid user, invalid password, no access from the computer to the database, and an incorrect authentication method.
Error Accessing the Database Table
Once the connection is established, the Table configuration is specific to a table. In the "Table" combobox, the list of available tables automatically appears. It is possible, via script, to change which table will be accessed. However, care must be taken that the table exists and that the configuration is done using the correct name. The same care must be taken when Queries are used, as it is the user's responsibility to type the correct table name, as well as the syntax of the separators.
Error in the Syntax of the Query
It is the user's responsibility to type a correct SQLStatement when writing queries manually rather than with the QueryBuilder. Table names, columns, and values can all generate an error if used incorrectly. For example, comparing different types may not return the expected result, and strings should generally be enclosed in single quotes. The available separators and clauses vary between databases. For example:
SQLServer
```sql
SELECT TOP 10 * FROM table WHERE column = value
```
SQLite
```sql
SELECT * FROM table WHERE column = value LIMIT 10;
```
Oracle
```sql
SELECT * FROM table WHERE column = value AND ROWNUM <= 10;
```
Oracle (newer versions)
```sql
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;
```
IBM DB2
```sql
SELECT * FROM table WHERE column = value FETCH FIRST 10 ROWS ONLY;
```
Oracle DB size limitation for nchar
The Oracle database has a size limitation of 2000 bytes for columns of the nchar type. Some columns created by the platform exceeded this limit, which caused the error. Specifically, the affected columns are:

- TagsDictionary: TagName column
- RuntimeUsers: Profile and PasswordHistory columns

To address this, when the Dataset module detects an Oracle DB connection, it automatically adjusts the length of these columns from 2048 to 1999. For all other database providers, the column lengths remain unchanged.
ServerIP without TWebServer Running on the Remote Machine
In some cases, the computer may not have access to the database. In this case, it is possible to create a gateway, routing the commands to be executed on a computer that does have access to the database. The ServerIP field should be configured with the IP and port (<IP>:<port>) of that computer. That computer must have the TWebServer software installed and running; it will automatically perform the gateway service and forward the commands to the database.
DataTable Returned NULL
When a query returns null, an error has occurred. Common errors include: connection failure with the database, table not found, Dataset module not running, and incorrect query syntax. Check the return of the method using WithStatus when using a synchronous method, or use the LastStatus and LastStatusMessage properties when using asynchronous mode.
DataTable Returned with 0 Rows
When this happens, in general the connection with the database is working and the table name is correct. In this case, the search condition is usually wrong, or the table is genuinely empty. Check that the column names are correct and that the separators and clauses are valid.
Dataset Module is Down
Although the TServer is responsible for forwarding requests to the database, management of and communication with the TServer is done by the Dataset module, as is the handling of responses. Therefore, if you are having basic problems with database access and execution, the first thing to check is whether the module is set up to run and is actually running.
Very High Response Time
Sometimes it may seem that the database is not being accessed, when in fact a query is returning a very large amount of data, or the database is overloaded or misconfigured and performing poorly. The network itself can also be overloaded and slow; all of these factors impact response time. In these cases, it is important to execute the query directly in the database environment to confirm that the problem is not on the database side, and to check how long the database itself takes to execute the query. It is also worth checking the volume of data exchanged to understand any related side effects.
Update of a table with the wrong schema (select before update)
The Dataset module uses ADO.NET technology, and many operations are resolved at the level of this API. When performing an Update on a table, the table's schema and the controls of the .NET DataTable type are used. Therefore, if you perform an update passing a Tag or a .NET DataTable object as a parameter, that object must respect the schema of the destination table in the database. Normally, a Select command must have been issued at some point to obtain the correct schema used by the database. After that, it is easy to add, remove, and modify values in the DataTable and update it back to the physical table in the database.
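The "select before update" pattern can be sketched generically. The example below uses Python with sqlite3 standing in for ADO.NET, and the table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipe (id INTEGER PRIMARY KEY, setpoint REAL)")
conn.execute("INSERT INTO recipe (setpoint) VALUES (10.0)")

# Select first: the in-memory copy now carries the table's actual schema.
cur = conn.execute("SELECT id, setpoint FROM recipe WHERE id = 1")
columns = [d[0] for d in cur.description]   # schema obtained from the SELECT
row = dict(zip(columns, cur.fetchone()))

# Modify the in-memory copy, then write it back using the same schema.
row["setpoint"] = 12.5
conn.execute("UPDATE recipe SET setpoint = ? WHERE id = ?",
             (row["setpoint"], row["id"]))

print(conn.execute("SELECT setpoint FROM recipe WHERE id = 1").fetchone()[0])
# 12.5
```

Updating with a structure whose columns or types do not match the destination table is exactly the mismatch this pattern avoids.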
WHERE Condition Case Sensitivity
Case sensitivity in a WHERE clause depends on the database and its configuration. For example, in MySQL, comparisons are case-insensitive by default, which means 'abc' and 'ABC' are considered equal; this can be changed with specific database settings. In SQL Server, case sensitivity is also determined by the database configuration. In PostgreSQL, comparisons are case-sensitive by default, so 'abc' and 'ABC' are considered different. It therefore depends on the specific database and its settings. If you need to ensure case-insensitivity in a query, you can use functions like UPPER() or LOWER() to convert both sides to the same case before comparison. For example:
```sql
SELECT * FROM table WHERE LOWER(column) = LOWER(value);
```
This query will return records where the column matches the value, regardless of capitalization.
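The behavior is easy to demonstrate with SQLite, whose default BINARY collation makes text comparison case-sensitive (Python sketch; the table and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("ABC",), ("abc",), ("xyz",)])

# Direct comparison uses SQLite's default case-sensitive BINARY collation:
print(conn.execute(
    "SELECT COUNT(*) FROM t WHERE name = 'abc'").fetchone()[0])              # 1

# Folding both sides with LOWER() makes the comparison case-insensitive:
print(conn.execute(
    "SELECT COUNT(*) FROM t WHERE LOWER(name) = LOWER('ABC')").fetchone()[0]) # 2
```

Note that applying a function to the column can prevent the database from using an index on that column; for large tables, a case-insensitive collation on the column is often the better choice.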
Performance
The Dataset module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.
Best Practices and Recommendations
Error Handling
Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the error message updates the module's Error property (LastStatus). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the process returns an empty DataTable and updates the Error property. Alternatively, you can call methods like SelectCommandWithStatus, where the status is an output parameter of the method.
In this section:
Configuring the Dataset Module
Learn how to connect to data sources, create queries, and optimize performance for efficient data management.
This section provides essential guidance for setting up and customizing the Dataset Module, including:
- Datasets and SQL Queries
- Managing DB Connections
- Datasets Queries Configuration
- Datasets Tables Configuration
The typical configuration workflow for the Dataset Module has the following sequence:
When using SQLite databases, the Dataset Module can automatically create the database if necessary; for other database types, the database itself must already exist before you set up your connection. Users in any Permission group can create new connections in the Project, but only Administrators can configure database login passwords.
To create a new Database connection:

1. Go to Datasets → DBs.
2. Click Create New. The Create New Database Connection window displays.
3. Enter or select information, as needed.
4. Click OK. The database is added as a new row in the table.
5. Edit the row fields to modify the required settings.
Dataset DB Configuration Properties

Column | Description
---|---
Name | Enter a name for the database configuration. The system lets you know if the name is not valid.
Provider | Identifies the provider technology used in this connection.
Database | Identifies the type of database for this connection.
ConnectionString | Enter the information needed to connect with the database. You can use macros in the connection string. Example: for the filename in a SQLite connection string, use <ProjectName>, which is replaced by the name of the project.
LogonName | Enter a valid login name for the database.
LogonPassword | Enter the password that corresponds to the database login. (Only accessible by Administrators.)
ServerIP | Optionally, an IP or DNS name for a computer to be used as a Secure Gateway.
Description | Enter a description for the database connection.
Please check Connecting with SQL Server and Connecting with Excel for additional information.
There are four database connections already created in any new Project:
Datasets DB - Pre-defined database connections

DB | Database | Path Location | Usage
---|---|---|---
Retentive | SQLite | <ProjectNameAndPath>.dbRetentive | Stores values for Retentive Tags.
RuntimeUsers | SQLite | <ProjectNameAndPath>.dbRuntimeUsers | Stores dynamically created Users.
AlarmHistorian | SQLite | <ProjectNameAndPath>.dbAlarmHistorian | Stores Alarm and AuditTrail records.
TagHistorian | SQLite | <ProjectNameAndPath>.dbTagHistorian | Stores Tag Historian and Annotations.
Any of them can be customized to any type of database.
The selection of the best storage location depends on many factors, from internal company procedures to the volume of data and how the data will be used. Therefore, that decision belongs to each Project according to its requirements.
If you need to use another database for the pre-defined connections, execute the following steps:

1. Rename or delete the previous DB. This step is necessary because the system does not allow two objects with the same name.
2. Create a new DB with the same name as the previous DB, with the required Database and connection strings.
You can configure queries to perform more advanced functions with SQL statements to work with data from external databases.
To configure Dataset queries:

1. Go to Datasets → Queries.
2. Enter the field values as needed.
Dataset Query Configuration Properties

Column | Description
---|---
Name | Enter a name for the query. The system lets you know if the name is not valid.
DB | Select the database configuration.
SqlStatement | Enter the query using SQL syntax.
Mapping | Click "..." to select the tags that you want to populate with data from specific columns returned by the query.
MappingDateTime | Select the time reference (UTC or Local).
Description | Enter a description for the query configuration.
With the Visual Query Editor, users can drag and drop tables, define relationships, and add filters and conditions using a simple graphical interface. Once the query is created, it can be saved and executed like any other query within the Dataset Module.
Check the Visual SQL Query Builder page for complete information.
To configure dataset tables:

1. Go to Datasets → Tables.
2. Enter the field values as needed.
Dataset Table Configuration Properties

Field / Column | Description
---|---
Name | Enter a name for the table configuration. The system lets you know if the name is not valid.
DB | Select the database connection.
TableName | Select or type the name of the table in the database you want to access.
WhereCondition | Specify the parameters that will filter the data, using SQL syntax. E.g. "ColumnName = {tag.tagInt}".
Access | Select the access permissions for the table.
Mapping | Click "..." to select the tags that you want to populate with data from specific columns in the first row of the table.
MappingDateTime | Select the time reference (UTC or Local).
Description | Enter a description for the table configuration.
To configure dataset files:

1. Go to Datasets → Files.
2. Enter the field values as needed.
Dataset File Configuration Properties

Column | Description
---|---
Name | Enter a name for the file configuration. The system lets you know if the name is not valid.
FileName | Enter the full path to the file. The file path can have Tag values embedded using curly-brackets syntax, e.g. ExampleFile{{tag.Test}}.txt. When executing, the area in curly brackets is replaced by the value of the Tag.
FileType | Select the type of file.
Objects | Click "..." to select the tags that you want to populate with data from specific columns of the file.
Description | Enter a description for the file configuration.
XMLSchemaType | Represents the schema type of an XML file, which can be: TagList, an XML file that contains a tag list with tag names and tag values; or TagObject, an XML file that contains the entire tag tree and its children.
Working with the Dataset Module
Runtime Execution
One of the key features of the Dataset Module is the ability to execute SQL queries and retrieve data in real-time. Here are some ways to leverage the runtime execution features of the Dataset Module:
- Create SQL queries to retrieve data from external databases.
- Use query results to trigger events and actions within the platform environment.
- Configure event triggers based on specific query criteria, such as changes to a specific data point or a threshold value being exceeded.
The Dataset Module can be easily integrated with other modules within the software environment. Here are some examples of how the Dataset Module can be used in conjunction with other modules:
- Alarm Manager: Configure alarms based on query results to trigger notifications and actions.
- Visualization: Display query results on screens and dashboards using DataGrids and other visualization tools.
- Scripting: Use query results to trigger custom scripts and perform complex data processing and analysis.
By leveraging these integration options, users can gain greater insight and control over their data sources within the platform. With the ability to execute SQL queries and trigger actions based on query results, the Dataset Module provides a powerful set of tools for working with data.
Monitoring Databases Connections
Monitoring Database Connections is an essential aspect of maintaining a reliable and efficient system within the platform. By keeping track of database connections, you can ensure that your data is being accessed and updated correctly. Here are some ways to monitor database connections:
- Connection Status: Use the ConnectionStatus runtime attribute to check if a database connection is active or inactive. This can help you identify any connection issues and take corrective action when necessary.
- Query Status: The QueryStatus runtime attribute indicates whether a query is currently executing or not. This information can help you monitor query performance and identify potential bottlenecks or issues.
- Query Execution Metrics: Keep track of query execution details using attributes like QueryLastExecution, which shows the date and time of the last query execution, and QueryExecutionCount, which indicates the number of times the query has been executed. These metrics can provide insights into system performance and help you optimize your queries.
- Data Acquisition Rate: Monitor the rate at which data is being acquired by the Dataset Module using the DataAcquisitionRate attribute. This information can help you identify potential issues with data retrieval and ensure that your system is operating efficiently.
Showing DataGrids Tables on Displays
One of the key features of the Dataset Module is the ability to display query results on screens and dashboards using visualization tools like DataGrids. Here are some steps for using DataGrids to display query results:
- Create a query in the Dataset Module to retrieve the desired data.
- In the Visualization Module, add a DataGrid control to the screen or dashboard.
- Configure the DataGrid to display the query results by selecting the data source and column mappings.
- Save and preview the screen or dashboard to display the query results on the DataGrid.
Using Query Results on Scripts and Tags
Users can use query results to trigger actions in custom scripts and tags. Here are some steps for using query results in scripts and tags:
- Create a query in the Dataset Module to retrieve the desired data.
- In the Scripting Module, create a custom script that references the query results.
- Use the query results to trigger specific actions within the script, such as sending notifications or updating tags.
- Save and execute the script to perform the desired actions.
Check the Using Stored Procedures page for additional information.
Dataset Module Runtime Attributes
The Dataset namespace exposes properties and methods of the .NET objects used by the Dataset Module execution.
For more information on namespaces and objects, go to Objects and Attributes.
This section describes only some commonly used properties; for the full list of properties and methods, go to the Namespaces Reference.
Dataset Module Properties examples

Property | Type | Description
---|---|---
Dataset.IsStarted | Boolean | Flag indicating whether the Dataset Module has started.
Dataset.OpenStatusMessage | String | "OK" or an error message from the module initialization.
Dataset.Query.Query1.SelectCommand() | DataTable | Executes Query1 and returns a DataTable object with the values.
- ConnectionStatus: indicates whether the database connection is active or inactive.
- QueryStatus: indicates whether the query is currently executing or not.
- QueryLastExecution: indicates the date and time of the last query execution.
- QueryExecutionCount: indicates the number of times the query has been executed.
- DataAcquisitionRate: indicates the rate at which data is being acquired by the Dataset Module.
Troubleshooting and Best Practices