One of my favorite ways to find escape defects in code is to employ a generic fuzzer. I typically have 5-10 laptops running fuzzers for various projects at any given time. I was recently doing some research on sqlite3, and so started to fuzz the implementation built into macOS. The fuzzer generates random SQL statements and executes them against a SQLite database file; if any errors are encountered, they're printed to the console. There's not much logic here, and you could add more complex tests to improve it: a SQL grammar to generate valid SQL statements, or a genetic algorithm to evolve SQL statements that are more likely to find bugs. Use…
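The fuzzer itself isn't included in this excerpt. As a rough, hypothetical sketch of the approach, here's one way to throw random SQL at an in-memory SQLite database from Python; the statement generator and names below are illustrative, not the original code:

# Minimal sketch (not the original fuzzer): execute randomly generated SQL
# against an in-memory SQLite database and print anything unexpected.
import random
import sqlite3
import string

KEYWORDS = ["SELECT", "INSERT INTO", "CREATE TABLE", "DROP TABLE",
            "UPDATE", "DELETE FROM", "PRAGMA", "VACUUM"]

def random_token(max_len=8):
    alphabet = string.ascii_letters + string.digits + " (),*'"
    return "".join(random.choice(alphabet) for _ in range(random.randint(1, max_len)))

def random_statement():
    # No grammar here; keywords glued to noise, which is what makes it generic.
    return random.choice(KEYWORDS) + " " + " ".join(
        random_token() for _ in range(random.randint(1, 6)))

def fuzz(iterations=10000):
    connection = sqlite3.connect(":memory:")
    for _ in range(iterations):
        statement = random_statement()
        try:
            connection.execute(statement)
        except sqlite3.OperationalError:
            pass  # ordinary parse/SQL errors are expected noise
        except Exception as error:  # anything else is worth a closer look
            print(f"{type(error).__name__}: {error}\n  {statement}")

if __name__ == "__main__":
    fuzz()

A SQL grammar or a genetic algorithm, as suggested above, would replace random_statement() with something that produces mostly valid statements.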
-
-
The Importance of Retaining Time Series Data
We need to keep all of our data, and we need to keep it in multiple dimensions. These days, that means planning to stream data into a centralized repository, or warehouse, and then maintaining that data in a fashion where it's serialized by time. Time series refers to data points that are indexed and ordered by the specific times at which they were recorded. For example, a snapshot of tables in a database exported every few hours. It can also be defined as data recorded in sequence, measuring the same specific setting over time, in order, such as when you have data streaming into a repository…
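As a small, purely hypothetical illustration of that definition, here is a time-indexed series in pandas; the table name, timestamps, and counts are made up:

# Hypothetical illustration: row counts of a table, snapshotted every six
# hours, indexed and ordered by the time each snapshot was taken.
import pandas as pd

row_counts = pd.Series(
    [10250, 10391, 10407, 10552],
    index=pd.to_datetime([
        "2023-01-01 00:00", "2023-01-01 06:00",
        "2023-01-01 12:00", "2023-01-01 18:00",
    ]),
    name="customers_row_count",
)
print(row_counts)                        # points ordered by timestamp
print(row_counts.resample("D").mean())   # retained data can be rolled up later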
-
Convert a SQL Dump to Excel
The attached script can be used to migrate data out of a sqldump and into csv and xls for further analysis or ETL’ing. The name of each table is used to create a separate csv file (e.g. table1.csv), and an xls representation of each table is written to a sheet in a workbook. The script requires Python 3.7, so if you’re on a Mac you might need to install that. Here, we use Homebrew to do so: brew install python3 Then you’d need to install the specific modules: python3 -m pip install pandas
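The attached script isn't reproduced in this excerpt, but a minimal sketch of the same idea might look like the following. It assumes a simple dump with one-row-per-line INSERT statements, and the regex, file names, and helpers are illustrative rather than the original code (the Excel output also needs openpyxl installed alongside pandas):

# Rough sketch, not the attached script: pull rows out of simple
# "INSERT INTO `table` VALUES (...);" lines, then write one csv per table
# and a single workbook with one sheet per table.
# Requires: python3 -m pip install pandas openpyxl
import csv
import re
import sys
from collections import defaultdict

import pandas as pd

INSERT_RE = re.compile(r"INSERT INTO `?(\w+)`? VALUES \((.+)\);", re.IGNORECASE)

def parse_dump(path):
    tables = defaultdict(list)
    with open(path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = INSERT_RE.match(line.strip())
            if match:
                table, values = match.groups()
                # the csv module handles commas inside quoted values
                tables[table].append(next(csv.reader([values], quotechar="'")))
    return tables

def export(tables, workbook="export.xlsx"):
    with pd.ExcelWriter(workbook) as writer:
        for table, rows in tables.items():
            frame = pd.DataFrame(rows)
            frame.to_csv(f"{table}.csv", index=False, header=False)
            frame.to_excel(writer, sheet_name=table[:31], index=False, header=False)

if __name__ == "__main__":
    export(parse_dump(sys.argv[1]))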
-
Python Script To Move A SQL Database To .csv Files
You have a database, such as a mysql dump of a Jamf Pro server, or a sql dump of a WordPress site. You want to bring it into another tool or clean the data using a csv as an intermediary. Or you’re using an Amazon Glue job to ETL the data. The following script will take that sql dump and convert it into a bunch of csv files. To use the script, simply run it with $1 as the path to the sql file and $2 as the path to the export directory, as follows: python sqlcsvexport.py /sql_file_path /target_dir To download the script:
-
mysql command switches
MySQL command options:
--auto-rehash: Enable or disable automatic rehashing
--auto-vertical-output: Automatically display output of commands vertically rather than horizontally
--batch: Do not write to the history file
--binary-mode: I use this when I have large results timing out due to blobs being in there
--bind-address: Specify a network interface when connecting to the server
--character-sets-dir: Indicate a custom directory where character sets are installed
--column-names: Output column names when running queries
--column-type-info: Show metadata
--comments: Enable or disable commenting when sending statements to the server
--compress: Compress data sent to/from client and server
--connect_timeout: Seconds of inactivity before a timeout
--database: The database to connect to
--debug: Write a debugging log; supported only if MySQL was built with debugging support
--debug-check: Print debugging information…
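As a quick, hypothetical example of a few of these switches in use (the database name and query are placeholders), the client can also be driven non-interactively from a script:

# Hypothetical example: run a single query in batch mode with a few of the
# switches listed above and capture the tab-separated output.
import subprocess

result = subprocess.run(
    ["mysql", "--batch", "--column-names", "--database=mydb",
     "-e", "SELECT ID, Site, Contact FROM Customers"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)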
-
Use the Slow Query Log in MySQL
Slow queries can cause poor database performance. MySQL has a slow query log, which lets you log queries that take too long, with “too long” being a user-configurable threshold. This allows you to quickly find craptastic queries in your logs. To enable the slow query log, use SET GLOBAL to set the slow_query_log option to ON: SET GLOBAL slow_query_log = 'ON'; By default, the slow query log identifies any query that takes 10 seconds or longer. To change that threshold, set long_query_time, also using SET GLOBAL. Here, we’ll set it to 7 seconds: SET GLOBAL long_query_time = 7; You can also set the log file to something other than the default location of /var/lib/mysql/hostname-slow.log.…
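Once the log starts filling up, a short script can surface the worst offenders. Here's a minimal sketch, assuming the default log location noted above and a simplified, one-statement-per-entry reading of the log format:

# Minimal sketch: scan a slow query log and print each logged statement
# along with its recorded query time.
import re
import sys

log_path = sys.argv[1] if len(sys.argv) > 1 else "/var/lib/mysql/hostname-slow.log"
query_time_re = re.compile(r"# Query_time: ([\d.]+)")

query_time = None
statement = []
with open(log_path) as handle:
    for line in handle:
        match = query_time_re.match(line)
        if match:
            query_time = float(match.group(1))
            statement = []
        elif line.startswith(("#", "SET timestamp", "use ")):
            continue  # skip the other header lines in each entry
        elif query_time is not None:
            statement.append(line.strip())
            if line.rstrip().endswith(";"):
                print(f"{query_time:10.3f}s  {' '.join(statement)}")
                query_time = None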
-
Use SQL Views to Grant Access and Constrain Output
You can grant specific users access to view certain columns in SQL without providing access to the whole database. This is pretty useful when delegating reporting to users, without giving them access to all of the data in your database. For example, a user might be able to see a column with an address, but not a column with a credit card number, increasing database security while allowing you to delegate certain tasks when appropriate. In this article, we’ll use the same “Customers” table from our first articles, plus a signupdate column:
ID | Site | Contact | Address | City | Zip | Country | SignupDate
1 | Krypted | Charles Edge | my house | Minneapolis | 55418 | US | 2005-01-01
2…
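As a hedged sketch of how that delegation might look (the database name, credentials, and reporting account below are placeholder assumptions, not from the original article), a view can expose only the non-sensitive columns and SELECT can be granted on the view rather than on the table:

# Minimal sketch: create a view over the non-sensitive Customers columns and
# grant a reporting user SELECT on the view only. Assumes a local MySQL server
# and the mysql-connector-python package; all names/credentials are placeholders.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="mydb")
cur = conn.cursor()

# The view hides Address, Zip, and any payment columns from report users.
cur.execute("""
    CREATE OR REPLACE VIEW CustomerContacts AS
    SELECT ID, Site, Contact, City, Country, SignupDate
    FROM Customers
""")

# Report users query the view; they never receive privileges on Customers itself.
cur.execute("GRANT SELECT ON mydb.CustomerContacts TO 'reporting'@'%'")
conn.commit()
cur.close()
conn.close()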
-
Index Those SQL Tables
If you have growing sets of data, one of the best ways to speed up database performance is to make sure each column in larger tables is indexed. You can easily index a column following this syntax (using the name of your table in the place of tablename and the name of your column in the place of columnname): ALTER TABLE tablename ADD INDEX (columnname); So if you have a table called Customers and the following columns (as in the case of my Customers database from the previous exercises): ID Site Contact Address City Zip Country You would index them all as follows: ALTER TABLE Customers ADD INDEX (ID); ALTER…
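Rather than typing each ALTER statement by hand, a short loop can issue them all. A minimal sketch, assuming a local MySQL server and the mysql-connector-python package (the credentials and database name are placeholders):

# Sketch: add an index on every column of the Customers table in one pass.
import mysql.connector

columns = ["ID", "Site", "Contact", "Address", "City", "Zip", "Country"]

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="mydb")
cur = conn.cursor()
for column in columns:
    cur.execute(f"ALTER TABLE Customers ADD INDEX ({column})")
cur.close()
conn.close()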
-
Manage SQL Schemas with Flyway
Flyway is a tool that allows you to perform version-controlled management of SQL schemas. You can download Flyway at http://flywaydb.org. It doesn’t require a build, so once downloaded, drop it where you want it to live and you can just summon the binary in scripts. Once installed, the following is the basic structure of a Flyway command: flyway [options] command By default, the configuration is read from conf/flyway.conf, and options passed on the command line override the configuration. The main commands are:
migrate: Migrates the database
clean: Drops all objects in the configured schemas
info: Prints information about applied, current, and pending migrations
validate: Validates the applied migrations…
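As a hedged example of passing options on the command line (the JDBC URL and credentials are placeholders, and this assumes the flyway binary is on the PATH), Flyway can be summoned from a script like so:

# Hypothetical sketch: run the "info" command from the list above against a
# local MySQL database, overriding conf/flyway.conf from the command line.
import subprocess

subprocess.run(
    ["flyway",
     "-url=jdbc:mysql://localhost:3306/mydb",
     "-user=root",
     "-password=secret",
     "info"],
    check=True,
)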
-
Install MySQL on Linux
Installing MySQL on Linux is pretty easy. You can use yum (or your favorite package manager) for most installs. Here, we’ll pull a list of packages from yum using repolist: yum repolist enabled | grep "mysql.*-community.*" You’ll then get a list of community edition MySQL packages that are available. Then, let’s say you’re installing on RHEL 6: we’ll pull a string from the repolist for an appropriate package and then do a localinstall of it: sudo yum localinstall mysql57-community-release-el6-157.noarch.rpm We could also grab mysql and all the other stuff we want to have with it: yum install mysql mysql-server mysql-libs And then start it up: service mysql start