mysql - How to optimise my SQL query
I have a table `user_notification`. There is a cron command running that executes:

SELECT * FROM `user_notification` `t` WHERE `t`.`status` = 2;

where status is 1 = send_fail, 2 = queue, 3 = success (and so on).

The table has around ~100k records and it is increasing exponentially day by day. The query is taking time. Is there a way to optimise this query?
# Table structure for table `user_notification`

CREATE TABLE `user_notification` (
  `id` int(11) NOT NULL,
  `user_id` int(11) NOT NULL,
  `notification_id` int(11) NOT NULL,
  `notification_title` varchar(256) NOT NULL,
  `notification_message` text NOT NULL,
  `status` int(1) NOT NULL,
  `created` int(11) NOT NULL,
  `updated` int(11) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

ALTER TABLE `user_notification`
  ADD PRIMARY KEY (`id`),
  ADD KEY `user_id` (`user_id`),
  ADD KEY `notification_id` (`notification_id`);

ALTER TABLE `user_notification`
  MODIFY `id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=94322;

ALTER TABLE `user_notification`
  ADD CONSTRAINT `user_notification_ibfk_1` FOREIGN KEY (`user_id`) REFERENCES `user` (`id`),
  ADD CONSTRAINT `user_notification_ibfk_2` FOREIGN KEY (`notification_id`) REFERENCES `notification` (`id`);

# Slow query log entry:
# Query_time: 0.010663  Lock_time: 0.000045  Rows_sent: 0  Rows_examined: 17294
SET timestamp=1491527264;
SELECT * FROM `user_notification` `t` WHERE `t`.`status` = 2;
As others have mentioned, creating an index on the `status` column should help.
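For example, a minimal sketch of adding that index (the index name `idx_status` is my own choice, not from your schema):

ALTER TABLE `user_notification` ADD INDEX `idx_status` (`status`);

-- If the cron job only needs a few columns, a covering index can avoid reading
-- the full rows; the column list below is an assumption about your job:
-- ALTER TABLE `user_notification` ADD INDEX `idx_status_user` (`status`, `user_id`);

Because only a small fraction of rows should be in the queued state at any time, the optimizer can use this index to read just those rows instead of scanning the whole table.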
It sounds like you are going to be working with a relatively small subset of rows (that is, the "queued" rows) in an increasingly huge table. You might want to consider putting the "queued" records in their own table, and moving them to a history table when the status changes to "success" or "failed". That way, you're always querying a relatively small table. Of course, this strategy involves deletes and inserts, which may cause other problems depending on how your application works. A sketch of that move follows below.
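A minimal sketch of moving a processed row, assuming hypothetical `user_notification_queue` and `user_notification_history` tables with the same columns (the table names and the id value are placeholders, not from your schema):

-- Run inside a transaction so the row is never in both tables at once.
START TRANSACTION;

INSERT INTO `user_notification_history`
  SELECT * FROM `user_notification_queue` WHERE `id` = 123;

DELETE FROM `user_notification_queue` WHERE `id` = 123;

COMMIT;

With this layout the cron job simply reads everything from the small queue table, and the history table can grow large without affecting the queue query.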