I always wish I didn’t have to pull in Python/pandas for simple data transformations on smaller local datasets/CSV files and could just stick with SQLite. But then I tend to quickly hit a problem that seems to require dynamic SQL. Is there anything in between SQL and a general programming language that is made for transforming tabular data?
T-SQL and PL/SQL allow more flexibility than plain SQL, and most modern platforms ship an equivalent procedural dialect (PL/pgSQL in Postgres, for example)
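The same dynamic-SQL idea also works client-side; here's a minimal sketch using Python's sqlite3 instead of a server-side procedural dialect (the table and column names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE metrics (region TEXT, q1 REAL, q2 REAL, q3 REAL)")
con.execute("INSERT INTO metrics VALUES ('north', 1.0, 2.0, 3.0)")

# Discover the columns at runtime, then build the statement from them.
cols = [row[1] for row in con.execute("PRAGMA table_info(metrics)")]
numeric = [c for c in cols if c != "region"]
sql = f"SELECT region, {' + '.join(numeric)} AS total FROM metrics"
print(con.execute(sql).fetchall())  # [('north', 6.0)]
```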
honestly once you’re familiar enough with sql you can do whatever you need to do via a series of tables, but most people aren’t that skilled
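A minimal sketch of that "series of tables" style using Python's sqlite3, on made-up data; each step materialises an intermediate table you can inspect on its own:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw (name TEXT, amount REAL)")
con.executemany("INSERT INTO raw VALUES (?, ?)",
                [("a", 1.0), ("a", 2.0), ("b", 5.0)])

# Stage 1: aggregate the raw data into a temp table.
con.execute("""CREATE TEMP TABLE step1 AS
               SELECT name, SUM(amount) AS total FROM raw GROUP BY name""")

# Stage 2: build on the previous stage, not on the raw data.
con.execute("""CREATE TEMP TABLE step2 AS
               SELECT name, total,
                      total / (SELECT SUM(total) FROM step1) AS share
               FROM step1""")
print(con.execute("SELECT * FROM step2 ORDER BY share DESC").fetchall())
```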
typical enterprise tool would be Informatica
Can I suggest DuckDB? You can start out writing SQL directly on top of CSV and Parquet files. But then if you want/need to do something more complicated, you can import duckdb into Python, keep running the SQL you already wrote, convert it to a Pandas or Polars dataframe, transform it, then query the result in SQL.
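A minimal sketch of that workflow (the file name and columns are made up for illustration):

```python
import duckdb  # pip install duckdb
import pandas as pd

# Hypothetical input; in practice this is the CSV you already have.
pd.DataFrame({"region": ["north", "south", "north"],
              "amount": [10.0, 5.0, 7.0]}).to_csv("sales.csv", index=False)

# Plain SQL straight over the file; DuckDB infers the schema.
rel = duckdb.sql("SELECT region, SUM(amount) AS total FROM 'sales.csv' GROUP BY region")

# Hand the result to pandas for anything that's awkward in SQL...
df = rel.df()
df["share"] = df["total"] / df["total"].sum()

# ...then query the dataframe by variable name, back in SQL.
print(duckdb.sql("SELECT region, share FROM df ORDER BY share DESC"))
```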
It depends on the data you have to work with, but SQL is quite capable in itself. Still, SQL can be tricky for some specific tasks (like unwrapping dimensions from a plain table when you have to partially rotate it, or building multidimensional datasets), and for those cases more traditional approaches tend to be far easier to grasp and use (see the sketch after this comment).
Like… Can you use it? Yes. Should you? If you’re highly skilled and proficient in SQL then sure, why not… But would you have asked the question in the first place if that were the case?
I don’t have to do such tasks often enough (once every month or two) to be bothered, and when I need to I’ll just smack my head against the table (ha-ha) until I make it work (remember that part about being skilled in SQL - I’m not skilled enough lol)… Maybe I’ll polish the result with some script or a bit of Go. But that’s not an approach I’d recommend on a regular basis.
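To make the "partial rotation" point above concrete, here's a pandas sketch on made-up data; melt and pivot each do in one call what would take UNION ALL or conditional aggregation in portable SQL:

```python
import pandas as pd

# Quarter-per-column layout, the kind that is painful to reshape in SQL.
wide = pd.DataFrame({"region": ["north", "south"],
                     "q1": [1.0, 4.0], "q2": [2.0, 5.0]})

# Unpivot (wide -> long) in one call...
long = wide.melt(id_vars="region", var_name="quarter", value_name="amount")

# ...and pivot back (long -> wide) just as easily.
back = long.pivot(index="region", columns="quarter", values="amount")
print(long)
print(back)
```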