Because the volume of data being inserted was large, front-end requests in our project were timing out. A look at the system code showed that rows were being inserted one at a time inside a loop, so the first optimization was to switch to batch writes. Research turned up several approaches.
The first approach is the saveBatch method that MyBatis Plus provides in ServiceImpl. Under the hood it still executes single-row inserts, flushed through JDBC batching:
```java
@Transactional(rollbackFor = {Exception.class})
public boolean saveBatch(Collection<T> entityList, int batchSize) {
    String sqlStatement = this.getSqlStatement(SqlMethod.INSERT_ONE);
    return this.executeBatch(entityList, batchSize,
            (sqlSession, entity) -> sqlSession.insert(sqlStatement, entity));
}
```
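A minimal usage sketch of this approach, assuming a userService built on MyBatis Plus's ServiceImpl (the service name and the batch size of 1000 are illustrative): every element still produces its own INSERT statement.

```java
// Hypothetical call site: still one INSERT per row under the hood,
// executed through JDBC batching and flushed every 1000 rows.
List<User> users = loadUsersToImport();
userService.saveBatch(users, 1000);
```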
The second approach is to hand-write a foreach insert in the mapper XML. The drawback is that every table needs its own hand-written XML; the advantage is good efficiency:
```xml
<!-- mapper.xml -->
<insert id="insertBatch">
    insert into user (id, name, age) values
    <foreach collection="list" item="user" separator=",">
        (#{user.id}, #{user.name}, #{user.age})
    </foreach>
</insert>
```
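The XML above pairs with a hand-written mapper method; a minimal sketch, assuming a User entity mapped to the user table (the method and parameter names are illustrative):

```java
import java.util.List;
import org.apache.ibatis.annotations.Param;

public interface UserMapper {
    // "list" must match the collection attribute of the <foreach> element above
    int insertBatch(@Param("list") List<User> users);
}
```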
The third approach, the InsertBatchSomeColumn injected method, also concatenates SQL under the hood, but no hand-written SQL is needed and its efficiency matches the second approach. This is the approach this article focuses on. The steps are as follows.
```java
// 1. Custom SQL injector: register the built-in InsertBatchSomeColumn method
public class MySqlInjector extends DefaultSqlInjector {

    @Override
    public List<AbstractMethod> getMethodList(Class<?> mapperClass, TableInfo tableInfo) {
        List<AbstractMethod> methodList = super.getMethodList(mapperClass, tableInfo);
        methodList.add(new InsertBatchSomeColumn(i -> i.getFieldFill() != FieldFill.UPDATE));
        return methodList;
    }
}
```
```java
// 2. Register the injector as a Spring bean
@Configuration
public class MybatisPlusConfig {

    @Bean
    public MySqlInjector sqlInjector() {
        return new MySqlInjector();
    }
}
```
```java
// 3. Base mapper exposing the injected method; business mappers extend this instead of BaseMapper
public interface MyBaseMapper<T> extends BaseMapper<T> {

    /**
     * Batch insert backed by the InsertBatchSomeColumn optional component registered above.
     */
    int insertBatchSomeColumn(List<T> entityList);
}
```
```java
// 4. Business mapper: extend MyBaseMapper to inherit insertBatchSomeColumn
@Mapper
public interface StudentMapper extends MyBaseMapper<Student> {

    @SelectProvider(value = MySelectProvider.class, method = "getSql")
    Student select(String sql);
}
```
If all goes well, the assembled SQL should look like this:
```sql
INSERT INTO student (id,name,age) VALUES (?,?,?) , (?,?,?) , (?,?,?)
```
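Calling the injected method is then no different from any other mapper method; a minimal sketch, assuming a service wired with the StudentMapper above (the service class name is illustrative):

```java
import java.util.List;
import org.springframework.stereotype.Service;

@Service
public class StudentBatchService {

    private final StudentMapper studentMapper;

    public StudentBatchService(StudentMapper studentMapper) {
        this.studentMapper = studentMapper;
    }

    public int saveAll(List<Student> students) {
        // one multi-row INSERT is generated from the whole list
        return studentMapper.insertBatchSomeColumn(students);
    }
}
```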
For a regular MySQL database, this step would complete the job. The project, however, uses MyBatis Plus 3.4.2, and the author of InsertBatchSomeColumn himself notes in the comments that it has only been tested against MySQL. Oracle's batch-insert syntax differs from MySQL's, so the generated SQL has to be changed.
First, a quick look at Oracle's batch insert statement:
```sql
INSERT ALL
    INTO emp (empno, ename) VALUES (1001, '张三')
    INTO emp (empno, ename) VALUES (1002, '李四')
    INTO emp (empno, ename) VALUES (1003, '王五')
SELECT * FROM dual;
```
So the SQL needs to be assembled into this structure. Looking at the InsertBatchSomeColumn class, the SQL assembly logic lives in its injectMappedStatement method, so we mimic InsertBatchSomeColumn and write our own assembly logic:
```java
@NoArgsConstructor
@AllArgsConstructor
@SuppressWarnings("serial")
public class MyInsertBatchSomeColumn extends InsertBatchSomeColumn {

    @Setter
    @Accessors(chain = true)
    private Predicate<TableFieldInfo> predicate;

    /** Outer script template: %s is replaced by the generated <foreach> block. */
    private final String INSERT_BATCH_SQL = "<script>\nINSERT ALL\n%s\n</script>";

    @SuppressWarnings("Duplicates")
    @Override
    public MappedStatement injectMappedStatement(Class<?> mapperClass, Class<?> modelClass, TableInfo tableInfo) {
        // disabled when the POJO type is Map
        if (tableInfo.getEntityType().equals(Map.class)) {
            return null;
        }
        KeyGenerator keyGenerator = new NoKeyGenerator();
        SqlMethod sqlMethod = SqlMethod.INSERT_ONE;
        List<TableFieldInfo> fieldList = tableInfo.getFieldList();
        String insertSqlColumn = tableInfo.getKeyInsertSqlColumn(false)
                + this.filterTableFieldInfo(fieldList, predicate, TableFieldInfo::getInsertSqlColumn, EMPTY);
        // strip the trailing comma
        String columns = insertSqlColumn.substring(0, insertSqlColumn.length() - 1);
        String insertSqlProperty = tableInfo.getKeyInsertSqlProperty(ENTITY_DOT, false)
                + this.filterTableFieldInfo(fieldList, predicate, i -> i.getInsertSqlProperty(ENTITY_DOT), EMPTY);
        insertSqlProperty = LEFT_BRACKET + insertSqlProperty.substring(0, insertSqlProperty.length() - 1) + RIGHT_BRACKET;
        String valuesScript = convertForeach(insertSqlProperty, "list", tableInfo.getTableName(), columns, ENTITY, NEWLINE);

        String keyProperty = null;
        String keyColumn = null;
        // primary key handling; if the table has no PK the key is treated as an ordinary column
        if (tableInfo.havePK()) {
            if (tableInfo.getIdType() == IdType.AUTO) {
                /* auto-increment primary key */
                keyGenerator = new Jdbc3KeyGenerator();
                keyProperty = tableInfo.getKeyProperty();
                keyColumn = tableInfo.getKeyColumn();
            } else if (null != tableInfo.getKeySequence()) {
                keyGenerator = TableInfoHelper.genKeyGenerator(getMethod(sqlMethod), tableInfo, builderAssistant);
                keyProperty = tableInfo.getKeyProperty();
                keyColumn = tableInfo.getKeyColumn();
            }
        }
        String sql = String.format(INSERT_BATCH_SQL, valuesScript);
        SqlSource sqlSource = languageDriver.createSqlSource(configuration, sql, modelClass);
        return this.addInsertMappedStatement(mapperClass, modelClass, getMethod(sqlMethod),
                sqlSource, keyGenerator, keyProperty, keyColumn);
    }

    /**
     * Builds the <foreach> block: each iteration emits "INTO table (columns) VALUES (...)",
     * and "SELECT * FROM dual" is appended after the loop to satisfy Oracle's INSERT ALL syntax.
     */
    public static String convertForeach(final String sqlScript, final String collection, final String tableName,
                                        final String columns, final String item, final String separator) {
        StringBuilder sb = new StringBuilder("<foreach");
        if (StringUtils.isNotBlank(collection)) {
            sb.append(" collection=\"").append(collection).append("\"");
        }
        if (StringUtils.isNotBlank(item)) {
            sb.append(" item=\"").append(item).append("\"");
        }
        if (StringUtils.isNotBlank(separator)) {
            sb.append(" separator=\"").append(separator).append("\"");
        }
        sb.append(">").append("\n");
        if (StringUtils.isNotBlank(tableName)) {
            sb.append(" INTO ").append(tableName).append(" ");
        }
        if (StringUtils.isNotBlank(columns)) {
            sb.append(LEFT_BRACKET).append(columns).append(RIGHT_BRACKET).append(" VALUES ");
        }
        return sb.append(sqlScript).append("\n").append("</foreach>").append("\n")
                 .append(" SELECT ").append("*").append(" FROM dual").toString();
    }
}
```
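If the assembly above works as intended, the statement executed for the student table should expand into Oracle's INSERT ALL form, for example with three rows (a hypothetical expectation mirroring the MySQL example earlier):

```sql
INSERT ALL
 INTO student (id,name,age) VALUES (?,?,?)
 INTO student (id,name,age) VALUES (?,?,?)
 INTO student (id,name,age) VALUES (?,?,?)
SELECT * FROM dual
```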
Finally, modify the custom SQL injector from step 1 to register our own implementation:
```java
public class MySqlInjector extends DefaultSqlInjector {

    @Override
    public List<AbstractMethod> getMethodList(Class<?> mapperClass, TableInfo tableInfo) {
        List<AbstractMethod> methodList = super.getMethodList(mapperClass, tableInfo);
        // swap in our own implementation, MyInsertBatchSomeColumn
        methodList.add(new MyInsertBatchSomeColumn(i -> i.getFieldFill() != FieldFill.UPDATE));
        return methodList;
    }
}
```
That completes Oracle batch insert with only a small code change. If the speed still falls short, the next step is to consider multithreading.
In actual testing, MyBatis Plus's InsertBatchSomeColumn does not split the data into groups: whatever you pass in is concatenated into one single SQL statement. For tables with many columns and large data volumes this can end up slower than single-row inserts, because although one giant statement avoids repeated network round trips, the database becomes much slower at parsing it. The data therefore needs to be inserted in batches, and choosing a suitable batch size is what actually delivers the speedup. The batch size should also differ per table, since column counts differ: tables with few columns can use larger batches, tables with many columns should use smaller ones.
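A minimal sketch of the chunked insert, assuming the StudentMapper above; the batch size of 500 is only an illustrative starting point and should be tuned per table:

```java
import java.util.List;
import java.util.function.Consumer;

public final class BatchInsertUtil {

    private BatchInsertUtil() {
    }

    /**
     * Splits records into fixed-size chunks and hands each chunk to the insert function,
     * so every generated multi-row INSERT stays at a manageable size.
     */
    public static <T> void insertInChunks(List<T> records, int batchSize, Consumer<List<T>> insertFn) {
        for (int start = 0; start < records.size(); start += batchSize) {
            int end = Math.min(start + batchSize, records.size());
            insertFn.accept(records.subList(start, end));
        }
    }
}

// Usage (hypothetical): narrow tables can take a larger batch size, wide tables a smaller one.
// BatchInsertUtil.insertInChunks(students, 500, studentMapper::insertBatchSomeColumn);
```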