Connect Teradata using Python pyodbc – Steps and example

Teradata is one of the most widely used MPP databases. Teradata provides many connectors, including its own Python modules such as teradata and teradatasql, which you can use to interact with the Teradata database. There are many options for connecting to Teradata, and Teradata supports both JDBC and ODBC drivers. In this article, we will check how to connect to Teradata using the Python pyodbc module and the ODBC driver, with a working example. Teradata ODBC Driver: Being a popular MPP database, the Teradata database server comes with ODBC support. Before attempting to connect…
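The connection flow can be sketched with pyodbc as below. This is a minimal sketch, assuming a DSN-less connection string; the driver name, host address, and `dbc` credentials are placeholders, not values from the article, and must match your own ODBC driver installation:

```python
# Minimal sketch of connecting to Teradata via pyodbc. The driver name,
# host, and credentials below are placeholders -- adjust them to match
# your ODBC driver installation and environment.

def build_conn_string(driver, host, user, password):
    """Assemble a DSN-less ODBC connection string for Teradata."""
    return (
        f"DRIVER={{{driver}}};"
        f"DBCNAME={host};"
        f"UID={user};"
        f"PWD={password};"
    )

def fetch_current_date(conn_string):
    """Connect and run a simple query. Requires pyodbc and a reachable server."""
    import pyodbc  # imported here so the sketch loads without pyodbc installed
    with pyodbc.connect(conn_string, autocommit=True) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT CURRENT_DATE")
        return cursor.fetchone()[0]

conn_string = build_conn_string(
    "Teradata Database ODBC Driver", "192.168.0.10", "dbc", "dbc"
)
print(conn_string)
```

`DBCNAME` is the Teradata ODBC keyword for the server host; with a configured DSN you could instead pass `DSN=<name>;UID=...;PWD=...` to `pyodbc.connect`.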

Optimize Snowflake Table Structure to Improve Performance

The performance of the Snowflake cloud data warehouse is directly dependent on an optimal table structure and design. Optimizing the Snowflake table structure is one of the important aspects of improving the performance of queries and of the data loading and unloading processes. In this article, we will check how to optimize the Snowflake table structure to improve query performance. Optimize Snowflake Table Structure: There are no specific best practices that you can apply to optimize the table structure. Creating an optimal table structure that uses the right data types and lengths is one of the…
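As a hypothetical illustration of the "right data type and length" point, compare the same table defined with generic text types versus right-sized native types. The `sales` table and its columns are made up for this sketch; the DDL is built as strings so the example runs without a Snowflake connection:

```python
# Hypothetical illustration: the same table defined with oversized, generic
# types versus right-sized types. Table and column names are made up.

generic_ddl = """
CREATE TABLE sales (
  sale_id    VARCHAR,          -- numeric key stored as an unbounded string
  sale_date  VARCHAR,          -- date stored as text
  amount     VARCHAR           -- money stored as text
);
"""

optimized_ddl = """
CREATE TABLE sales (
  sale_id    NUMBER(10, 0),    -- integer surrogate key
  sale_date  DATE,             -- native date type
  amount     NUMBER(12, 2)     -- fixed-scale decimal for currency
);
"""

print(optimized_ddl)
```

Native date and numeric types avoid per-query casts and let the engine compare and prune values directly, which text columns cannot.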

Snowflake Transient Tables, Usage and Examples

Snowflake transient tables are similar to permanent tables, with the key difference that they do not have a Fail-safe period. Transient tables are also similar to temporary tables, but you have to explicitly drop transient tables at the end of the session. Snowflake Transient Tables: Snowflake transient tables persist until explicitly dropped and are available to all users with the appropriate privileges. Transient tables are designed for transitory data that needs to be maintained beyond the current session. Because transient tables do not have a Fail-safe period, they provide…
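A sketch of the create-and-drop lifecycle, with the DDL built as strings so the example runs without a Snowflake connection; the `session_events` table and its columns are made-up placeholders:

```python
# Sketch of transient table DDL and the explicit cleanup the text mentions.
# Table and column names are placeholders for illustration.

create_stmt = """
CREATE TRANSIENT TABLE session_events (
  event_id   NUMBER(10, 0),
  event_time TIMESTAMP_NTZ,
  payload    VARCHAR
)
DATA_RETENTION_TIME_IN_DAYS = 0;  -- optionally disable Time Travel as well
"""

# Transient tables persist until explicitly dropped:
drop_stmt = "DROP TABLE IF EXISTS session_events;"

# With a connector such as snowflake-connector-python you would run
# these via cursor.execute(create_stmt) / cursor.execute(drop_stmt).
print(create_stmt)
```

Setting `DATA_RETENTION_TIME_IN_DAYS = 0` further reduces storage cost for purely transitory data, at the price of losing Time Travel on the table.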

Snowflake Temporary Tables, Usage and Examples

Similar to other relational databases, Snowflake supports creating temp or temporary tables to hold non-permanent data, i.e. data that is used only in the current session. In this article, we will check how to create Snowflake temp tables, along with their syntax, usage and restrictions, with some examples. Snowflake Temporary Tables: A temporary table in Snowflake is visible only within the current session. Temporary tables exist only within the session in which they were created and persist only for the remainder of that session. Once the session ends, the system will purge the data…
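The syntax can be sketched as below, again built as a string so the example runs without a Snowflake connection; the `stage_customers` table is a made-up placeholder:

```python
# Sketch of temporary table DDL. Table and column names are placeholders.

create_stmt = """
CREATE TEMPORARY TABLE stage_customers (
  customer_id NUMBER(10, 0),
  name        VARCHAR
);
"""

# The table is visible only to the creating session; no explicit DROP is
# required -- Snowflake purges it automatically when the session ends.
print(create_stmt)
```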

Snowflake Load Local CSV File using COPY and Example

Many organizations use flat files such as CSV or TSV files to offload large tables. Managing flat files such as CSV is easy, and they can be transported by any electronic medium. In this article, we will check how to load or import a local CSV file into Snowflake using the COPY command, with some examples. Load Local CSV File using Snowflake COPY Command: There are a couple of methods that you can use to load a CSV file present on your local system. Following are the methods: use the SnowSQL command line…
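The SnowSQL-based flow is typically two statements: stage the local file with PUT, then COPY it into the target table. A sketch with the statements built as strings; the file path, table name, and file-format options are placeholder assumptions:

```python
# Sketch of the two-step load. Paths and table names are placeholders.
# PUT is run from a client such as SnowSQL or the Python connector.

# Step 1: upload the local file to the table's internal stage (@%sales).
put_cmd = "PUT file:///tmp/sales.csv @%sales AUTO_COMPRESS=TRUE;"

# Step 2: copy the staged (auto-compressed) file into the table.
copy_cmd = """
COPY INTO sales
FROM @%sales/sales.csv.gz
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);
"""

print(put_cmd)
print(copy_cmd)
```

`@%sales` refers to the table stage that Snowflake provides for each table; a named stage or the user stage (`@~`) would work the same way.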

Snowflake Type of Subqueries and Examples

In general, a subquery in a database is a nested query block within a query statement. It is simply a SELECT expression enclosed in parentheses. A subquery may return zero, one, or more values to its outer or parent SELECT statement. In this article, we will check the Snowflake types of subqueries with examples. Snowflake Subqueries: A subquery in Snowflake is a nested SELECT statement that returns zero or more records to its outer SELECT statement. The outer SELECT statement that contains the subquery is sometimes referred to as a…
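Both major shapes of subquery follow standard SQL and can be demonstrated with a runnable sketch. The demo below uses the stdlib sqlite3 module so it runs without a Snowflake account; the same statements work unchanged in Snowflake, and the `orders` table is made up for the example:

```python
import sqlite3

# In-memory sample data standing in for a Snowflake table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO orders VALUES (1, 10, 50.0), (2, 10, 150.0), (3, 20, 75.0);
""")

# Uncorrelated scalar subquery: runs once, compares each order
# to the overall average amount.
above_avg = conn.execute("""
    SELECT order_id
    FROM orders
    WHERE amount > (SELECT AVG(amount) FROM orders)
    ORDER BY order_id
""").fetchall()
print(above_avg)   # orders above the overall average

# Correlated subquery: re-evaluated per outer row, finds each
# customer's largest order.
largest = conn.execute("""
    SELECT o.order_id
    FROM orders o
    WHERE o.amount = (SELECT MAX(i.amount)
                      FROM orders i
                      WHERE i.customer_id = o.customer_id)
    ORDER BY o.order_id
""").fetchall()
print(largest)     # largest order per customer
```

The first subquery references nothing from the outer query (uncorrelated); the second references `o.customer_id` from the outer row (correlated), which is the distinction the article's type breakdown builds on.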

Snowflake Fixed-Width File Loading Options and Examples

Fixed-width text files are a special case of text files where the format is specified by column widths, a pad character, and left or right alignment. Many telecom companies use fixed-width files to store call detail record (CDR) data. In this format, column widths are specified in units of characters. In this article, we will learn about Snowflake fixed-width file loading options with examples. Snowflake Fixed-Width File Loading: Fixed-width data files have uniform lengths for each column of data. Each field in a fixed-width data file has exactly the…
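The layout itself can be illustrated in pure Python: each field occupies an exact character range of the line. The field names and widths below are made up for a toy CDR record:

```python
# Toy fixed-width CDR layout: (field name, start offset, end offset).
# Names and widths are made up for illustration.
LAYOUT = [("caller", 0, 10), ("callee", 10, 20), ("seconds", 20, 25)]

def parse_line(line):
    """Slice one fixed-width record into a dict, trimming pad characters."""
    return {name: line[start:end].strip() for name, start, end in LAYOUT}

# 10 chars caller + 10 chars callee + 5 chars duration = 25-char record.
record = parse_line("5551234567555987654300042")
print(record)
```

One common way to handle such files in Snowflake is to COPY each line into a single VARCHAR column and then split the fields with `SUBSTR` (e.g. `SUBSTR(line, 1, 10)`), mirroring the slicing above in SQL.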

Spark SQL CASE WHEN on DataFrame – Examples

In general, the CASE expression or command is a conditional expression, similar to if-then-else statements found in other languages. Spark SQL supports almost all features that are available in Apache Hive. One such feature is the CASE statement. In this article, we will check how to use the CASE WHEN and OTHERWISE statements on a Spark SQL DataFrame. Spark SQL CASE WHEN on DataFrame: The CASE WHEN and OTHERWISE statement tests whether any of a sequence of expressions is true, and returns the corresponding result for the first true expression. Spark…
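Spark SQL's CASE WHEN follows standard SQL semantics, so the behavior can be demonstrated with a runnable sketch using the stdlib sqlite3 module (no Spark cluster needed); the `scores` table is made up, and the equivalent PySpark DataFrame call is shown in comments:

```python
import sqlite3

# Sample data standing in for a Spark DataFrame/table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE scores (name TEXT, score INTEGER);
INSERT INTO scores VALUES ('a', 95), ('b', 70), ('c', 40);
""")

# CASE returns the result of the FIRST true WHEN branch, else the ELSE value.
rows = conn.execute("""
    SELECT name,
           CASE WHEN score >= 90 THEN 'high'
                WHEN score >= 60 THEN 'medium'
                ELSE 'low'
           END AS grade
    FROM scores
    ORDER BY name
""").fetchall()
print(rows)  # [('a', 'high'), ('b', 'medium'), ('c', 'low')]

# Equivalent PySpark DataFrame API (not run here; requires pyspark):
# from pyspark.sql.functions import when
# df.withColumn("grade",
#     when(df.score >= 90, "high")
#     .when(df.score >= 60, "medium")
#     .otherwise("low"))
```

Note that a score of 95 matches both WHEN branches but is labeled `'high'`: only the first true branch fires, which is the "first true expression" rule from the excerpt.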

Import CSV file to Pyspark DataFrame – Example

Many organizations use a flat file format such as CSV or TSV to offload their tables. Managing flat files is easy, and they can be transported by any electronic medium. In this article, we will check how to import a CSV file into a PySpark DataFrame with some examples. Import CSV file to Pyspark DataFrame: There are many methods that you can use to import a CSV file into a PySpark or Spark DataFrame, but the following methods are easy to use: read a local CSV using the com.databricks.spark.csv format, or run a Spark SQL query to create a Spark DataFrame…
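A sketch of the basic flow: create a small sample CSV with the standard library, then read it with `spark.read.csv`. The Spark call is wrapped in a function so the sketch runs even without a pyspark installation; the file name and columns are made up:

```python
import csv
import os
import tempfile

# Write a small sample CSV so the example is self-contained.
path = os.path.join(tempfile.mkdtemp(), "people.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([["name", "age"], ["alice", "30"], ["bob", "25"]])

def load_csv_with_spark(csv_path):
    """Read the CSV into a Spark DataFrame. Requires pyspark installed."""
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.appName("csv-import").getOrCreate()
    # header=True uses the first row as column names;
    # inferSchema=True samples the data to pick column types.
    return spark.read.csv(csv_path, header=True, inferSchema=True)

# Without Spark available here, just verify the file with the csv module.
with open(path, newline="") as f:
    rows = list(csv.reader(f))
print(rows)
```

With pyspark installed, `load_csv_with_spark(path).show()` would display the two data rows with `name` and `age` columns.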

Spark SQL Date and Timestamp Functions and Examples

Spark SQL provides many built-in functions. Functions such as the date and time functions are useful when you are working with a DataFrame that stores date and time type values. The built-in functions also include type conversion functions that you can use to format date or time type values. In this article, we will check the available Spark SQL date and timestamp functions with some examples. Spark SQL Date and Timestamp Functions: Spark SQL supports almost all date and time functions that are supported in Apache Hive. You can use these…
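A few of the common functions can be sketched as below. The Spark query is wrapped in a function so the sketch runs without pyspark; the literal dates are made up, and the equivalent standard-library arithmetic is shown for comparison:

```python
from datetime import date, timedelta

def spark_date_examples(spark):
    """A few common Spark SQL date functions. Requires a SparkSession."""
    return spark.sql("""
        SELECT current_date()                        AS today,
               date_add(current_date(), 7)           AS next_week,
               to_date('2021-01-15', 'yyyy-MM-dd')   AS parsed,
               datediff('2021-01-15', '2021-01-01')  AS days_between
    """)

# The same arithmetic with the standard library, for comparison:
next_week = date.today() + timedelta(days=7)        # ~ date_add(current_date(), 7)
days_between = (date(2021, 1, 15) - date(2021, 1, 1)).days  # ~ datediff(end, start)
print(days_between)
```

Note that Spark's `datediff(end, start)` takes the end date first, matching the subtraction order in the standard-library version above.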
