R Sql Tableau United

I’ve been using SQL Server for years, working a lot with Tableau more recently, and playing around with R because it interests me. Tableau works seamlessly with SQL Server and can connect to R as well, and now SQL Server 2016 has a built-in R Server with lots of potential. So I thought about uniting all three of them. The first obstacle to overcome was the execution of basic R code in SQL Server. The installation is straightforward as part of SQL Server 2016 (https://msdn.microsoft.com/en-us/library/mt696069.aspx), but there’s an issue […]
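
For reference, a minimal sketch (not the exact code from the post) of running basic R code in SQL Server 2016 through the built-in sp_execute_external_script procedure; the Production.Product query and result-set shape are only an illustration:

    -- Enable external scripts once at the instance level
    -- (a restart of the SQL Server service may be required for the setting to take effect)
    EXEC sp_configure 'external scripts enabled', 1;
    RECONFIGURE;

    -- Run a trivial R script that simply echoes its input query back as a result set
    EXEC sp_execute_external_script
         @language     = N'R',
         @script       = N'OutputDataSet <- InputDataSet;',
         @input_data_1 = N'SELECT ProductID, Name FROM Production.Product'
    WITH RESULT SETS ((ProductID INT, Name NVARCHAR(50)));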

Easy Transposing (Cross-Tabulation) of Any Relational Table

The challenge: sometimes we need to transpose a table, that is, to turn unique values from one or more columns in the table into multiple columns in the results. This operation may actually de-normalize the data in the relational table. For example, let’s refer to the table Production.Product in the sample database AdventureWorks. Each ProductID belongs to a ProductLine, as shown in the sample below. Suppose we want to display the following result sets: a list in a row of all Name values per ProductLine; a list in a row […]
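
One possible way to produce such a per-ProductLine list in a single row (a sketch only, not necessarily the technique used in the full article) is the classic FOR XML PATH concatenation pattern:

    -- One row per ProductLine with all product names concatenated into a single column
    SELECT p.ProductLine,
           STUFF((SELECT ', ' + p2.Name
                  FROM Production.Product AS p2
                  WHERE p2.ProductLine = p.ProductLine
                  FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '') AS Names
    FROM Production.Product AS p
    WHERE p.ProductLine IS NOT NULL
    GROUP BY p.ProductLine;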

Rebuilding a HEAP with Nonaligned Indexes

This post is designed to help you rebuild a HEAP with nonaligned indexes without any issues. Starting in SQL Server 2008, you can rebuild an entire HEAP table using the command: ALTER TABLE tbl_Demo REBUILD. Moreover, if the HEAP table is partitioned, you may specify a particular partition to be rebuilt, as follows: ALTER TABLE tbl_Demo REBUILD PARTITION = n, where n is the partition number to be rebuilt (read more about this command here). While performing a regular daily index maintenance operation to rebuild all indexes and […]
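
Written out as runnable statements (tbl_Demo is the placeholder table name from the excerpt, and the partition number is only an example), the two rebuild commands look like this:

    -- Rebuild the entire heap table
    ALTER TABLE dbo.tbl_Demo REBUILD;

    -- Rebuild only one partition of a partitioned heap
    ALTER TABLE dbo.tbl_Demo REBUILD PARTITION = 3;   -- 3 is just an example partition number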

How to Optimize Your Optimization Plan

Database optimization plans are great. But the truth is that optimization plans must be optimized as well. The following maintenance plan for optimization was created at a client site. You may notice that it is incorrect. I thought I’d share this and explain how to correct it, so you can avoid making the same mistake or fix your own maintenance plans.

Super Boost Data Loads with UNUSABLE

Indexes are a great tool for performance tuning and often provide a drastic performance boost to our SQL statements by reducing the amount of IO performed. However, indexes also carry a performance penalty during DML operations (Insert / Update / Delete), and especially during heavy data loads into the table. This is due to index maintenance overhead: each change in the table causes a corresponding change in the index, and the more indexes exist on the table, the more acute this overhead becomes. To speed up the data load it […]
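
As a rough illustration of the idea (a sketch assuming an Oracle database, with a placeholder index name, not the article’s exact steps), an index can be marked UNUSABLE before the load and rebuilt once afterwards:

    ALTER INDEX ix_demo UNUSABLE;   -- stop maintaining the index during the load
    -- (SKIP_UNUSABLE_INDEXES defaults to TRUE, so DML can proceed while the index is unusable)
    -- ... perform the heavy data load into the table here ...
    ALTER INDEX ix_demo REBUILD;    -- rebuild the index once, after the load completes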

Big Data Essentials – Part 2 – Big Data Vs. Relational Databases

In this video, we discuss the three Vs – Volume, Velocity, and Variety – which are the differentiating factors separating Big Data from traditional databases.

Big Data Essentials – Part 1 – What is Big Data

Big Data Essentials: In this video series we try to provide simple answers to the complex questions in the world of Big Data technologies.

The top 12 new features of Oracle 12c

For this video, I have selected what I think are the top 12 new features of Oracle 12c: from the “game changing” Pluggable Database architecture to lesser known but extremely cool features such as improved top-N queries or VARCHAR.
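
For example, the improved top-N queries mentioned above boil down to the 12c row-limiting clause (a minimal sketch using the HR sample schema as a placeholder):

    -- Return only the 10 highest-paid employees using the new 12c syntax
    SELECT employee_id, last_name, salary
    FROM   employees
    ORDER  BY salary DESC
    FETCH  FIRST 10 ROWS ONLY;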

TRUNCATE TABLE and REUSE STORAGE

Were you ever required to empty a table in order to load fresh data into it? Many systems require refreshing the data in a table at regular intervals, based on a query that brings data from other sources. In that case, emptying the target table and then loading the new data is required. DELETE is the most commonly used SQL statement for removing data from a table. However, it carries a huge performance and overhead penalty for large data sets. This is because DELETE works row by row, and for each […]
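
For context, here is a minimal sketch contrasting the two approaches (Oracle syntax, with a placeholder table name):

    -- Row-by-row removal: fully logged, slow for large tables
    DELETE FROM sales_staging;

    -- DDL alternative: empties the table but keeps the already allocated extents
    TRUNCATE TABLE sales_staging REUSE STORAGE;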

Detecting Deadlocks in SQL Server (Part 2)

In the first article in this series, we learned how the deadlock phenomenon is created and detected. We now know that determining the cause of a deadlock event, and preventing it, requires the intervention of the development or DBA team. SQL Server provides its users with a variety of tools that help investigate and solve the phenomenon. This post walks through the steps for detecting the causes of a deadlock using SQL Server’s trace flags 1204 and 1222. Trace flags can be thought of as operating switches that control the behavior of […]
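
For reference, the two trace flags mentioned above can be enabled globally with DBCC TRACEON (a minimal sketch; the deadlock details are then written to the SQL Server error log):

    DBCC TRACEON (1204, 1222, -1);   -- -1 enables the flags at the server (global) level
    DBCC TRACESTATUS (-1);           -- verify which trace flags are currently active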

Detecting Deadlocks in SQL Server (Part 1)

Deadlocks are events that occasionally appear in SQL Server-based applications. The challenge is locating the cause of the problem and applying the appropriate solution, which requires intervention by the development or DBA team. Investigating the root causes of a deadlock may be complex and at times even frustrating. To systematically solve this problem, one must be familiar with the tools offered by SQL Server and know how to use these tools effectively. In a series spanning several articles, I will show you how to use these tools and methods and how to […]

Using Oracle Flashback Archive

Starting with Oracle 9iR2 (remember the old days?), we have had the capability to run Flashback Queries and access “old” information from tables, based on the changes to the database stored in UNDO. The principle is simple and uses Oracle’s Read Consistency mechanism, which is already part of the database: for each row that is updated in the DB, the previous image of the row is stored in the UNDO tablespace for the benefit of long-running queries. The downside of relying on UNDO data for Flashback Query […]
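
A minimal Flashback Query sketch (placeholder table name, not from the article) reads the rows as they were 15 minutes ago from the UNDO data:

    SELECT *
    FROM   employees AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '15' MINUTE)
    WHERE  employee_id = 100;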

Run an Oracle DataPump job using PL/SQL

Oracle’s DataPump technology allows easy import and export of data between databases. DataPump is most commonly used through the expdp and impdp command-line utilities; many do so by opening a remote connection to the server and running DataPump from the OS command prompt. However, this is not necessary. What many don’t know is that DataPump has a PL/SQL interface, so it can be accessed directly from the database […]
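
As an illustration of that PL/SQL interface, here is a minimal DBMS_DATAPUMP sketch that starts a schema-level export from inside the database (the schema, job, directory object, and file names are placeholders, not taken from the article):

    DECLARE
      l_handle NUMBER;
      l_state  VARCHAR2(30);
    BEGIN
      -- Create an export job in SCHEMA mode
      l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                                     job_mode  => 'SCHEMA',
                                     job_name  => 'DEMO_EXP_JOB');

      -- Write the dump file to an existing directory object
      DBMS_DATAPUMP.ADD_FILE(handle    => l_handle,
                             filename  => 'demo_exp.dmp',
                             directory => 'DATA_PUMP_DIR',
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

      -- Export only the HR schema
      DBMS_DATAPUMP.METADATA_FILTER(handle => l_handle,
                                    name   => 'SCHEMA_EXPR',
                                    value  => 'IN (''HR'')');

      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);   -- blocks until the job completes
      DBMS_OUTPUT.PUT_LINE('DataPump job finished with state: ' || l_state);
    END;
    /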