I don’t write as much about what I do at my day job as I used to. I probably should, because it’s all still relevant to my older readers. One example: I have published a lot of articles on MySQL and SQLite, but more and more of my development efforts use native JSON. I’ve slowly fallen in love with MongoDB for this reason – I can have a local database that is fast and that stores data in native JSON. That makes my transactions cheaper when it’s time to read from and write to the database. If I have to regex to put information in SQL and then take…
-
Simple sqlite3 Fuzzer
One of my favorite ways to find escape defects in code is to employ a generic fuzzer. I typically have 5-10 laptops running fuzzers for various projects at any given time. I was recently doing some research on sqlite3, so I started to fuzz the implementation built into macOS. The fuzzer generates random SQL statements and executes them against a SQLite database file. If any errors are encountered, they are printed to the console. There’s not much logic here; to improve it, add more complex tests, like a SQL grammar to generate valid SQL statements, or a genetic algorithm to evolve SQL statements that are more likely to find bugs. Use…
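The full script lives in the post itself; as a rough, minimal sketch of the approach (hypothetical, using only Python’s standard-library sqlite3 module against a scratch database), something like this would exercise the parser with random statements:

import random
import sqlite3
import string

def random_statement(max_len=64):
    # build a random string of SQL-ish tokens and punctuation
    alphabet = string.ascii_letters + string.digits + " ();,'\"*=-"
    return "".join(random.choice(alphabet) for _ in range(random.randint(1, max_len)))

def fuzz(iterations=10000):
    conn = sqlite3.connect("fuzz.db")  # hypothetical scratch database file
    for i in range(iterations):
        stmt = random_statement()
        try:
            conn.execute(stmt)
        except sqlite3.OperationalError:
            pass  # expected: most random statements are simply invalid SQL
        except Exception as e:
            # anything other than a plain syntax error may be worth a closer look
            print(f"[{i}] {type(e).__name__}: {e!r} <- {stmt!r}")
    conn.close()

if __name__ == "__main__":
    fuzz()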
-
Simple Serialized SQL Inserts
Working with SQL is pretty straightforward. One can easily parse XML, JSON, or just CSV and insert values into new rows in a database. Here, we’ll insert two fields, user_name and role_name, into a new row in a table called krypted: INSERT INTO krypted (user_name, role_name) VALUES ('krypteduser', 'admingroup');
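As a quick, hypothetical illustration of the serialization side (reading rows from a CSV and inserting each one), a parameterized version using Python’s sqlite3 module might look like this; the file name and schema are assumptions:

import csv
import sqlite3

conn = sqlite3.connect("krypted.db")  # hypothetical database file
conn.execute("CREATE TABLE IF NOT EXISTS krypted (user_name TEXT, role_name TEXT)")

# users.csv is assumed to have user_name,role_name columns
with open("users.csv", newline="") as f:
    for row in csv.DictReader(f):
        # placeholders avoid quoting bugs and SQL injection
        conn.execute(
            "INSERT INTO krypted (user_name, role_name) VALUES (?, ?)",
            (row["user_name"], row["role_name"]),
        )

conn.commit()
conn.close()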
-
Convert a SQL Dump to Excel
The attached script can be used to migrate data out of a SQL dump and into CSV and XLS for further analysis or ETL’ing. The name of each table is used to create a separate CSV file (e.g. table1.csv), and each table is also written as a sheet in an XLS workbook. The script requires Python 3.7, so if you’re on a Mac you might need to install that. Here, we use Homebrew to do so: brew install python3 Then you’d need to install the specific modules: python3 -m pip install pandas
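The attached script isn’t reproduced here, but the core of the CSV-to-workbook step with pandas might look like the following sketch (file and table names are hypothetical; pandas writes .xlsx via openpyxl these days, so treat this as an approximation of the idea rather than the script itself):

import pandas as pd

# hypothetical per-table CSV files produced from the dump
tables = ["table1", "table2"]

with pd.ExcelWriter("dump.xlsx") as writer:
    for table in tables:
        df = pd.read_csv(f"{table}.csv")
        # one sheet per table, named after the table
        df.to_excel(writer, sheet_name=table, index=False)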
-
Python Script To Move A SQL Database To .csv Files
You have a database, such as a MySQL dump of a Jamf Pro server or a SQL dump of a WordPress site. You want to bring it into another tool or clean the data using a CSV as an intermediary. Or you’re using an Amazon Glue job to ETL the data. The following script will take that SQL dump and convert it into a bunch of CSV files. To use the script, simply run it with $1 as the path to the SQL file and $2 as the path to the export directory, as follows: python sqlcsvexport.py /sql_file_path /target_dir To download the script:
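The script itself is linked from the post; the general technique (scanning the dump for INSERT statements and writing one CSV per table) might be sketched like this, with the parsing deliberately simplified:

import csv
import re
import sys

# very simplified: assumes one-row INSERTs of the form
# INSERT INTO `table` VALUES ('a', 'b', ...);
INSERT_RE = re.compile(r"INSERT INTO `?(\w+)`? VALUES \((.+)\);")

def dump_to_csvs(sql_path, target_dir):
    writers = {}  # table name -> csv.writer
    files = {}
    with open(sql_path, encoding="utf-8", errors="replace") as dump:
        for line in dump:
            m = INSERT_RE.match(line.strip())
            if not m:
                continue
            table, values = m.groups()
            if table not in writers:
                files[table] = open(f"{target_dir}/{table}.csv", "w", newline="")
                writers[table] = csv.writer(files[table])
            # naive split; a real parser must respect commas inside quoted values
            writers[table].writerow(v.strip().strip("'") for v in values.split(","))
    for f in files.values():
        f.close()

if __name__ == "__main__":
    dump_to_csvs(sys.argv[1], sys.argv[2])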
-
Show MySQL Settings
MySQL usually pulls settings from a my.cnf file. However, you can end up with settings in include files, which can be defined in my.cnf using the following directives: include /home/mydir/myopt.cnf includedir /home/mydir Because of this, and because you might not have access to all locations of .cnf files on a filesystem, you can also grab settings using the SHOW VARIABLES statement within SQL, obtained by /usr/local/mysql/bin/mysql -uroot -pmypassword -e "SHOW VARIABLES;" > /tmp/SQLSettings.txt In the above command, -uroot defines that we’ll be accessing with the root user, -p defines the password (listed as mypassword, with no space after the -p) and -e defines that we want to execute a command and then…
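If you’d rather end up with the variables in a structure you can query, a rough sketch that shells out to the same mysql binary and parses the tab-separated batch output might look like this (the path and credentials are placeholders mirroring the command above):

import subprocess

# placeholder path and credentials, matching the command from the post
cmd = ["/usr/local/mysql/bin/mysql", "-uroot", "-pmypassword",
       "--batch", "-e", "SHOW VARIABLES;"]

out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

# --batch output is tab-separated: Variable_name<TAB>Value, one per line,
# with a header row we skip
variables = dict(
    line.split("\t", 1) for line in out.splitlines()[1:] if "\t" in line
)
print(variables.get("slow_query_log"))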
-
Mapping New SQLite Databases
I’ve written about SQLite databases here and there over the years. A number of Apple tools and third party tools for the platform run on SQLite, and it’s usually a pretty straightforward process to get into a database and inspect what’s there and how you might programmatically interact with tools that store data in SQLite. And I’ll frequently use a tool like Navicat to quickly and visually hop in and look at what happens when I edit data that then gets committed to the database. But I don’t always have tools like that around. So when I want to inspect new databases, or at least those new to me,…
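The excerpt cuts off before the walkthrough, but the basic mapping step, listing every table and its schema out of sqlite_master, can be sketched in a few lines of Python (the database path is a placeholder):

import sqlite3

conn = sqlite3.connect("/path/to/database.db")  # placeholder path

# sqlite_master holds the schema for every object in the database
for name, sql in conn.execute(
    "SELECT name, sql FROM sqlite_master WHERE type = 'table' ORDER BY name"
):
    print(f"-- {name}\n{sql}\n")

conn.close()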
-
Use the Slow Query Log in MySQL
Slow queries can cause poor database performance. MySQL has a slow query log, which lets you log queries that take longer than a user-configurable threshold. This allows you to quickly find craptastic queries in your logs. To enable the slow query log, use SET GLOBAL to set the slow_query_log option to ON: SET GLOBAL slow_query_log = 'ON'; By default, the slow query log identifies any query that takes 10 seconds or longer. To change that, set long_query_time using SET GLOBAL. Here, we’ll set it to 7 seconds: SET GLOBAL long_query_time = 7; You can also set the log file to something other than the default location of /var/lib/mysql/hostname-slow.log.…
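If you want to script this rather than type it into a mysql shell, a hypothetical sketch using the mysql-connector-python package (credentials are placeholders, and SET GLOBAL requires a privileged account) would run the same two statements and verify them:

import mysql.connector  # assumes the mysql-connector-python package

# placeholder credentials; must be an account allowed to SET GLOBAL
conn = mysql.connector.connect(host="localhost", user="root", password="mypassword")
cur = conn.cursor()

# the two settings from the post: enable the log, lower the threshold
cur.execute("SET GLOBAL slow_query_log = 'ON'")
cur.execute("SET GLOBAL long_query_time = 7")

# verify what took effect
cur.execute("SHOW VARIABLES LIKE 'slow_query%'")
for name, value in cur.fetchall():
    print(name, "=", value)

cur.close()
conn.close()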
-
Use SQL Views to Grant Access and Constrain Output
You can grant users access to certain columns in SQL without giving them access to the whole database. This is pretty useful when delegating reporting to users, without giving them access to all of the data in your database. For example, a user might be able to see a column with an address, but not a column with a credit card number, increasing database security while allowing you to delegate certain tasks when appropriate. In this article, we’ll use the same “Customers” table from our first articles, with its SignupDate column:

ID | Site    | Contact      | Address  | City        | Zip   | Country | SignupDate
1  | Krypted | Charles Edge | my house | Minneapolis | 55418 | US      | 2005-01-01
2…
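The excerpt stops before the view itself; the pattern the post is building toward (create a view over the safe columns, then grant SELECT on only the view) might be sketched like this, where the database, view, and user names are all hypothetical:

import mysql.connector  # assumes the mysql-connector-python package

# placeholder credentials; run as an account that can create views and grant
conn = mysql.connector.connect(host="localhost", user="root",
                               password="mypassword", database="mydb")
cur = conn.cursor()

# a view exposing only the non-sensitive columns of Customers
cur.execute("""
    CREATE OR REPLACE VIEW customer_report AS
    SELECT Site, Contact, City, Country, SignupDate
    FROM Customers
""")

# the reporting user can read the view but never the base table
cur.execute("GRANT SELECT ON mydb.customer_report TO 'reporter'@'localhost'")

cur.close()
conn.close()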
-
Index Those SQL Tables
If you have growing sets of data, one of the best ways to speed up database performance is to make sure each column in larger tables is indexed. You can easily index a column with the following syntax (using the name of your table in place of tablename and the name of your column in place of columnname): ALTER TABLE tablename ADD INDEX (columnname); So if you have a table called Customers with the following columns (as in the case of my Customers database from the previous exercises): ID Site Contact Address City Zip Country You would index them all as follows: ALTER TABLE Customers ADD INDEX (ID); ALTER…
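Rather than typing one ALTER statement per column, the repetition lends itself to a loop; a hypothetical sketch with the mysql-connector-python package (credentials and database name are placeholders):

import mysql.connector  # assumes the mysql-connector-python package

columns = ["ID", "Site", "Contact", "Address", "City", "Zip", "Country"]

# placeholder credentials
conn = mysql.connector.connect(host="localhost", user="root",
                               password="mypassword", database="mydb")
cur = conn.cursor()

for col in columns:
    # column names can't be bound as query parameters, so format the statement;
    # safe here only because the list is hard-coded, not user input
    cur.execute(f"ALTER TABLE Customers ADD INDEX ({col})")

cur.close()
conn.close()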