Overview

The Dataset Module connects your solution to databases through TServer services, managing data flow, security, and performance. This reference covers essential practices for production deployments, including concurrency management, security hardening, and performance optimization.

Architecture & Execution Model

How the Dataset Module Works

The Dataset Module operates as an intermediary layer between your solution and databases:

  • TServer Integration: All database operations route through TServer services
  • Execution Modes: Supports both synchronous (blocking) and asynchronous (non-blocking) operations
  • Result Processing: Returns .NET DataTable objects for flexible data manipulation
  • Distribution: Propagates data to displays, reports, scripts, and other modules

Synchronous vs Asynchronous Operations

Synchronous Execution:

  • Blocks execution until database responds
  • Returns DataTable directly to calling method
  • Stores result in Local Contents property
  • Best for: Small queries, immediate data needs

Asynchronous Execution:

  • Non-blocking operation
  • Updates Async Contents property when complete
  • Enables row-by-row navigation
  • Best for: Large datasets, UI responsiveness

Concurrency & Server Domain Management

Understanding Server Domain Attributes

All Dataset Module properties exist in the server domain, so every connected client reads and writes the same shared values:

Example Risk Scenario:
1. Client A sets: SQLStatement = "SELECT * FROM Orders WHERE Status='Open'"
2. Client B sets: SQLStatement = "SELECT * FROM Orders WHERE Status='Closed'"
3. Execute command runs with Client B's statement (last write wins)
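The scenario above can be sketched in plain Python (this is an illustration of shared-state semantics, not the platform API; the `SharedQuery` class is hypothetical):

```python
# Minimal sketch: a server-domain property behaves like one shared object,
# so the last writer determines what Execute actually runs.
class SharedQuery:
    """Hypothetical stand-in for a Dataset query object in the server domain."""
    def __init__(self):
        self.sql_statement = ""

    def execute(self):
        # Runs whatever statement is currently stored, regardless of
        # which client wrote it last.
        return f"executing: {self.sql_statement}"

query = SharedQuery()                                                 # one shared instance
query.sql_statement = "SELECT * FROM Orders WHERE Status='Open'"      # Client A writes
query.sql_statement = "SELECT * FROM Orders WHERE Status='Closed'"    # Client B overwrites
result = query.execute()                                              # Client A's Execute now
print(result)                                                         # runs Client B's statement
```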

Preventing Concurrency Conflicts

Strategy 1: Dedicated Query Objects

...

  • Execute queries in scripts with local variables
  • Process DataTables before assignment to tags
  • Minimize server domain property modifications
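The local-variable pattern can be sketched as follows; sqlite3 stands in for a TServer-backed query here, and the tag name in the comment is hypothetical. The point is the pattern, not the API: fetch into a local variable, finish processing, and only then publish one final value to shared state.

```python
import sqlite3

# Illustrative sketch: query into a LOCAL variable so no server-domain
# property is mutated while the data is still being processed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (Id INTEGER, Status TEXT)")
conn.executemany("INSERT INTO Orders VALUES (?, ?)",
                 [(1, "Open"), (2, "Closed"), (3, "Open")])

rows = conn.execute(
    "SELECT Id, Status FROM Orders WHERE Status = ?", ("Open",)
).fetchall()                        # local variable: no shared state touched yet

open_count = len(rows)              # process locally...
# @Tag.OpenOrderCount = open_count  # ...then write the finished result once
print(open_count)
```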

Data Management Strategies

DataTable Usage Patterns

The Dataset Module provides three primary patterns for DataTable management:

...

// Configure mapping, then navigate rows
@Dataset.Query.Query1.Select();
@Dataset.Query.Query1.Next(); // Moves to next row

Memory & Traffic Optimization

Control Data Volume:

  • Always use WHERE clauses to filter results
  • Implement pagination for large datasets
  • Select only the columns you need instead of using SELECT *
  • Consider indexed views for complex queries
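Pagination can be sketched with LIMIT/OFFSET; sqlite3 and the `fetch_page` helper below are illustrative, not Dataset Module properties:

```python
import sqlite3

# Illustrative pagination sketch: fetch one bounded page at a time
# instead of pulling the whole table into memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Readings (Id INTEGER PRIMARY KEY, Value REAL)")
conn.executemany("INSERT INTO Readings (Value) VALUES (?)",
                 [(float(i),) for i in range(100)])

def fetch_page(page, page_size=25):
    # ORDER BY gives a stable row order; LIMIT/OFFSET bound the result set.
    return conn.execute(
        "SELECT Id, Value FROM Readings ORDER BY Id LIMIT ? OFFSET ?",
        (page_size, page * page_size),
    ).fetchall()

first = fetch_page(0)
second = fetch_page(1)
print(len(first), first[0][0], second[0][0])   # 25 1 26
```

Note that LIMIT/OFFSET syntax varies by database (see the platform compatibility table below for per-database equivalents such as TOP and ROWNUM).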

...

Small Query: < 1,000 rows = Minimal impact
Medium Query: 1,000-10,000 rows = Monitor memory usage
Large Query: > 10,000 rows = Implement pagination

Security Implementation

SQL Injection Prevention

Never Do This:

string query = "SELECT * FROM Users WHERE Name = '" + userInput + "'";

...

  • Treats all parameters as values, not code
  • Prevents malicious SQL execution
  • Maintains data type integrity
  • Supports all major database platforms
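A minimal sketch of the contrast, using sqlite3 placeholders (the Dataset Module's own parameter syntax may differ, but the principle is identical: the driver binds user input as a value, never as executable SQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Users (Name TEXT)")
conn.execute("INSERT INTO Users VALUES ('alice')")

user_input = "x' OR '1'='1"   # classic injection attempt

# UNSAFE: string concatenation lets the input rewrite the query
unsafe_sql = "SELECT * FROM Users WHERE Name = '" + user_input + "'"
unsafe_rows = conn.execute(unsafe_sql).fetchall()    # returns every row!

# SAFE: the parameter is matched literally as a name, not parsed as SQL
safe_rows = conn.execute(
    "SELECT * FROM Users WHERE Name = ?", (user_input,)
).fetchall()

print(unsafe_rows)   # [('alice',)] -- injection succeeded
print(safe_rows)     # [] -- injection text matched nothing
```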

Network Security Architecture

Gateway Configuration for Restricted Databases:

...

  • Maintains network security policies
  • Centralizes database connections
  • Enables audit logging
  • Supports DMZ architectures

Database-Specific Configuration

Platform Compatibility

Database     Syntax Example                            Special Considerations
SQL Server   SELECT TOP 10 * FROM Table                Use TOP for limiting
SQLite       SELECT * FROM Table LIMIT 10              Use LIMIT clause
MySQL        SELECT * FROM `Table` LIMIT 10            Backticks for names
PostgreSQL   SELECT * FROM "Table" LIMIT 10            Case-sensitive names
Oracle       SELECT * FROM Table WHERE ROWNUM <= 10    ROWNUM for limiting

Time Zone Management

Default Behavior:

  • Platform stores all DateTime values as UTC
  • Historian and Alarm data always in UTC
  • Automatic conversion for display

...

-- Adjust for local time zone (example: EST -5 hours)
WHERE Timestamp >= DATEADD(hour, -5, @Tag.StartTimeUTC)
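The same adjustment can be done in client code. Note that a fixed -5 offset, as in the SQL above, ignores daylight saving time; in production, prefer a real time zone database (e.g. Python's zoneinfo) over a hard-coded offset. A minimal sketch with an explicit fixed offset:

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch: convert a UTC timestamp to a fixed -5 offset.
# A fixed offset is DST-unaware; use zoneinfo for real deployments.
utc_start = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
est = timezone(timedelta(hours=-5))          # fixed offset, matches the SQL example
local_start = utc_start.astimezone(est)
print(local_start.isoformat())               # 2024-06-01T07:00:00-05:00
```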

Performance Optimization

Query Optimization Checklist

  • Indexes: Ensure indexes on filtered and joined columns
  • Statistics: Update database statistics regularly
  • Query Plans: Review execution plans for bottlenecks
  • Connection Pooling: Enable for frequent operations
  • Batch Operations: Group multiple operations when possible

Performance Monitoring

Key Metrics to Track:

  • Query execution time
  • Memory consumption
  • Network latency
  • Connection pool usage
  • Cache hit rates

...

@Dataset.Query.Query1.Error        // Last error message
@Dataset.Query.Query1.ExecutionTime // Query duration
@Dataset.Query.Query1.RowCount      // Result size

Error Handling & Recovery

Error Detection

The module provides multiple error detection methods:

...

string status;
DataTable result = @Dataset.Query.Query1.SelectCommandWithStatusAsync(out status);
if (status != "OK") {
    // Handle error
}

Common Error Scenarios

Error Type           Typical Cause                 Resolution
Connection Timeout   Network issues, server load   Increase timeout, check connectivity
Syntax Error         Database-specific SQL         Verify syntax for target database
Permission Denied    Insufficient privileges       Check database user permissions
Deadlock             Concurrent transactions       Implement retry logic
Out of Memory        Large result set              Add pagination, increase resources
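Retry logic for transient errors such as deadlocks can be sketched as follows; the exception type, delays, and helper names are assumptions for illustration, not platform specifics:

```python
import time

# Illustrative retry-with-backoff sketch for transient database errors.
def run_with_retry(operation, retries=3, base_delay=0.01):
    for attempt in range(retries):
        try:
            return operation()
        except RuntimeError:            # stand-in for a deadlock error
            if attempt == retries - 1:
                raise                   # give up after the last attempt
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

# Simulated operation that "deadlocks" twice, then succeeds
attempts = {"n": 0}
def flaky_update():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("deadlock detected")
    return "committed"

result = run_with_retry(flaky_update)
print(result)   # "committed" on the third attempt
```

Only retry errors that are genuinely transient (deadlocks, timeouts); retrying a syntax or permission error just repeats the failure.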

Backup & Recovery

SQLite Backup Strategies

Option 1: Command Line Backup

...

  • Only when database is offline
  • Fastest for large databases
  • Requires downtime
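In contrast to an offline file copy, SQLite also supports online backup. A minimal sketch using Python's sqlite3 backup API (the connections and paths here are illustrative):

```python
import sqlite3

# Illustrative online backup: Connection.backup() takes a consistent
# snapshot of a live SQLite database without requiring downtime.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE Events (Id INTEGER, Msg TEXT)")
src.execute("INSERT INTO Events VALUES (1, 'started')")
src.commit()

dst = sqlite3.connect(":memory:")     # in production: sqlite3.connect("backup.db")
src.backup(dst)                       # copies pages while src stays available

copied = dst.execute("SELECT Msg FROM Events").fetchall()
print(copied)   # [('started',)]
```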

Backup Best Practices

  1. Schedule: Automate backups during low-activity periods
  2. Verify: Test restore procedures monthly
  3. Rotate: Maintain multiple backup generations
  4. Secure: Store backups in separate physical location
  5. Document: Maintain restore procedure documentation

Store & Forward Limitations

The Store & Forward feature has specific requirements:

...

For Store & Forward configuration, see Historian Archiving Process documentation.

Best Practices Summary

Design Principles

  1. Isolation: Use dedicated query objects for different operations
  2. Filtering: Always limit result sets with WHERE clauses
  3. Security: Use parameterized queries exclusively
  4. Monitoring: Track performance metrics and errors
  5. Planning: Design for concurrent access from the start

Production Checklist

Before deploying to production:

  • [ ] Parameterized all dynamic queries
  • [ ] Implemented error handling for all operations
  • [ ] Tested concurrent access scenarios
  • [ ] Configured appropriate timeouts
  • [ ] Established backup procedures
  • [ ] Documented recovery processes
  • [ ] Verified timezone handling
  • [ ] Optimized query performance
  • [ ] Planned for data growth
  • [ ] Secured network access

Related Documentation

  • Datasets Engine (Reference)
  • Datasets Runtime Attributes
  • Historian Archiving Process
  • TServer Configuration Guide