0 votes
1 answer
113 views

My query is taking too long to run when I use general filters, but it's fine with specific filters. I suspect it's an optimization issue. To fix it, I'm trying to create a non-clustered index on a new ...
asked by nexus-4
0 votes
0 answers
68 views

Optimizing a MySQL LIKE query with pattern matching for a large dataset (20M+ records): I'm struggling with performance issues in MySQL while searching through a large table containing over 20 million ...
asked by Aryan Gupta
1 vote
1 answer
42 views

I’m working on a project that involves managing large time-series datasets in GridDB. Currently, I’m experiencing significant query latency issues as the dataset size grows. While GridDB performs well ...
asked by Abel Mesfin
1 vote
1 answer
120 views

I'm designing a postgres database in which there is a table that has a jsonb type column. I would like this column to be unique. There is no need to have two objects with the exact same json ...
asked by hasdrubal
-1 votes
2 answers
56 views

In order to store the country information for a person I did: CREATE TABLE test ( id INT IDENTITY(1, 1), name VARCHAR(100) NOT NULL, country VARCHAR(100) NOT NULL, ...
asked by adragomir
-1 votes
1 answer
401 views

I have a parsed representation of an SQL query in the form of an abstract syntax tree. I'd like to convert it into an intermediate logical tree representation, which can in turn be expanded into ...
asked by Kathandrax
-1 votes
1 answer
50 views

Using Prisma and PostgreSQL: model Chat { id Int @id @default(autoincrement()) Users UserChat[] @relation(fields: [], references: []) Keywords ChatKeyword[] chatName ...
asked by codingcanbefun
0 votes
0 answers
98 views

I've got a pretty complex select query, but it's otherwise optimized based on the EXPLAIN. I want to create a concrete table from the results of this select. But the challenge is that this is a large ...
asked by kenshin9
0 votes
0 answers
438 views

I have the following SQL script to use for our Microsoft Dynamics NAV Database (MS SQL Server). It looks for missing indexes in the database. We have the problem that the script also finds some tables ...
asked by disnamedyna
1 vote
0 answers
30 views

In MySQL, it is possible to partition a table based on a timestamp column using the PARTITION BY RANGE clause in the CREATE TABLE statement. Can a similar approach be used in GridDB to partition a ...
asked by Ammar Ubaid
0 votes
1 answer
218 views

The situation I am having is the following. I have a transactions database table. I have both an incremental id for each transaction as well as a unique ID to use for front-end purposes. An example ID ...
asked by Mihail Minkov
0 votes
0 answers
346 views

I have close to 200M records in my tables. My migration script runs very slowly (for hours). ---My OLD Tables--- orders_old table 30M row count id createdDate status price 1000376453 2021-10-14 ...
asked by Oguzhan Cevik
1 vote
1 answer
94 views

Task: I want to get all invoices where the project title matches a given string. The Invoice has a foreign key to Project. The problem: I want to use a function for doing the project search so I can ...
asked by Ron
1 vote
0 answers
551 views

I'm in the middle of an issue where the page loading speed is not up to par with what we're looking for. I'm honestly out of ideas for now, and I hoped someone smarter could lead me ...
asked by ygnsl
0 votes
1 answer
67 views

Say I have SQLite table with the following records: recID productID productName 1 1 Product A 2 2 Product B 3 2 Product C 4 3 Product D 5 3 Product D recID = primary key, auto increment. If I run: ...
asked by Albert Tobing
5 votes
1 answer
4k views

I would like to speed up the queries on my big table that contains lots of old data. I have a table named post that has the date column created_at. The table has over ~31 million rows and ~30 million ...
asked by user3281975
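One technique often suggested for tables like the one above, where queries only touch recent rows, is a partial index restricted by a date cutoff. This is a sketch only, using SQLite via Python with made-up table contents and cutoff; Postgres supports the same `CREATE INDEX ... WHERE` syntax.

```python
import sqlite3

# Sketch: a partial index covering only recent rows, so lookups on fresh
# data stay fast while old rows don't bloat the index. Table name and the
# cutoff date are illustrative assumptions, not from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE post (id INTEGER PRIMARY KEY, created_at TEXT)")
conn.executemany("INSERT INTO post (created_at) VALUES (?)",
                 [("2020-01-01",), ("2024-06-01",), ("2024-07-01",)])
conn.execute("""CREATE INDEX post_recent_idx ON post (created_at)
                WHERE created_at >= '2024-01-01'""")

# The index is only usable when the query predicate implies the index's
# WHERE clause; repeating the cutoff verbatim is the safe way to ensure that.
recent = conn.execute("""
    SELECT id FROM post
    WHERE created_at >= '2024-01-01' ORDER BY id""").fetchall()
print([r[0] for r in recent])  # ids of the rows past the cutoff
```

The trade-off is that the cutoff is fixed at index-creation time, so the index typically needs to be rebuilt (or the cutoff chosen generously) as time advances.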
1 vote
1 answer
4k views

I have a WooCommerce site with around 10K products, but the database is 5.1 GiB; post_meta alone occupies 4.5 GiB while the wp_posts table is only 350 MB. I have tried the following query but still ...
asked by Chandra Shekhar Pandey
-1 votes
2 answers
102 views

I am trying to improve the performance of a query using a "materialized view" to optimize away joins. The first query below is the original, which employs joins. The second is the query ...
asked by CpnAhab
0 votes
1 answer
253 views

In case of a JOIN, especially in a one to many relationship, the result set will very often contain a lot of duplicate information in the result. For example, TABLE_A_ID TABLE_A_FIELD_ONE ...
asked by f.khantsis
4 votes
3 answers
6k views

I got a Postgres database with multiple schemas. I'm trying to optimise my database tables with optimal data types, but I often end up with the error: cannot alter the type of a column used by a view ...
asked by Sridas D
1 vote
1 answer
138 views

I am using PostgreSQL 10 + pg_trgm extension. Table layout: Column | Type | Collation | Nullable | Default | Storage | --------------+-------------------+-----------+...
asked by Xlv
1 vote
1 answer
3k views

I'm migrating my Postgres database and am attempting to update a string value to a numeric value, like this: UPDATE table SET column = 1 WHERE LENGTH(column) = 1; This table contains around 20 ...
asked by Californium
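For a ~20M-row update like the one above, one common approach is to run the UPDATE in primary-key batches so each transaction stays small. Below is a minimal sketch using SQLite via Python, with a hypothetical table `t` and column `col` standing in for the question's table; the batch size is deliberately tiny to make the loop visible.

```python
import sqlite3

# Sketch: batching a large UPDATE by primary-key ranges so that each
# transaction stays short. Table/column names are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, col TEXT)")
conn.executemany("INSERT INTO t (col) VALUES (?)",
                 [("a",), ("bb",), ("c",), ("ddd",), ("e",)])

BATCH = 2  # in practice this would be tens of thousands of rows
last_id = 0
max_id = conn.execute("SELECT MAX(id) FROM t").fetchone()[0]
while last_id < max_id:
    conn.execute(
        "UPDATE t SET col = '1' "
        "WHERE LENGTH(col) = 1 AND id > ? AND id <= ?",
        (last_id, last_id + BATCH))
    conn.commit()  # commit per batch keeps locks and undo logs small
    last_id += BATCH

print([r[0] for r in conn.execute("SELECT col FROM t ORDER BY id")])
# ['1', 'bb', '1', 'ddd', '1']
```

In Postgres the same loop would typically be driven from a script or a `DO` block, committing between batches.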
1 vote
1 answer
151 views

I currently have four tables that need to be joined with a left join. There are about 400,000 data records (which will grow in the future). In addition to the left join, the ...
asked by lublak
0 votes
1 answer
465 views

We are converting DB2 procs over to SQL Server using the Microsoft SQL Server Migration Assistant, and we get the following error in the generated SQL Server proc: Errors:DB22SS0245 The conversion of ...
asked by Sobin George
0 votes
1 answer
43 views

Here is the query I am trying to execute: DELETE FROM testmachine WHERE workdone != 0 AND timetaken < 1617215400 LIMIT 1000; It takes more than 50 seconds to execute. I want to ...
asked by Ritika
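A pattern often used for slow bulk deletes like the one above is to delete in small keyed batches, optionally with an index covering the filter columns. The sketch below uses SQLite via Python (where `DELETE ... LIMIT` isn't available by default, so the batch is selected in a subquery); the table mirrors the question's names, but the data is invented.

```python
import sqlite3

# Sketch: deleting in small batches via a primary-key subquery.
# Table/column names follow the question; rows are made up.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE testmachine (
    id INTEGER PRIMARY KEY, workdone INTEGER, timetaken INTEGER)""")
conn.executemany(
    "INSERT INTO testmachine (workdone, timetaken) VALUES (?, ?)",
    [(0, 100), (5, 100), (7, 100), (0, 2_000_000_000), (3, 2_000_000_000)])
# An index on the filter columns may help, depending on how selective they are.
conn.execute("CREATE INDEX idx_tm ON testmachine (timetaken, workdone)")

deleted = 0
while True:
    cur = conn.execute("""
        DELETE FROM testmachine WHERE id IN (
            SELECT id FROM testmachine
            WHERE workdone != 0 AND timetaken < 1617215400
            LIMIT 2)""")  # small batches keep each transaction short
    conn.commit()
    if cur.rowcount == 0:
        break
    deleted += cur.rowcount

print(deleted)  # total rows matching the predicate that were removed
```

On MySQL the loop is the same idea, just with `DELETE ... LIMIT n` repeated until zero rows are affected.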
1 vote
1 answer
2k views

Thanks in advance for your help on this. I have a VPS with the following specs: OS: CentOS 7.x, CPU Model: Common KVM processor, CPU Details: 6 Core (2200 MHz), Distro Name: CentOS Linux ...
asked by ALE TAU
-2 votes
1 answer
632 views

I have a large database with login and level columns. All queries were slow before I created indexes on these columns: CREATE INDEX users_login_index on users(login); CREATE INDEX users_level_index on ...
asked by zenno2
0 votes
1 answer
204 views

Is it possible for a function to return multiple distinct types? Example: I have the following tables: games, games_rounds and games_players. To load a full game by its id I need to first load it from ...
asked by Dr.ink
5 votes
1 answer
7k views

Though it may sound like a stupid question, sometimes it is necessary to show page numbers (and also the last page). What is the best way to calculate total row counts and also calculate page numbers (...
asked by Sakibur Rahman
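One answer to the page-number question above is to fetch the page and the total row count in a single query with a window function, then derive the last page with ceiling division. A sketch in SQLite via Python (window functions need SQLite ≥ 3.25; the `posts` table is invented):

```python
import sqlite3

# Sketch: one query returns both the requested page and the total row
# count, via COUNT(*) OVER (). Table name and page size are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO posts (title) VALUES (?)",
                 [(f"post {i}",) for i in range(10)])

PAGE_SIZE = 3
rows = conn.execute("""
    SELECT id, title, COUNT(*) OVER () AS total
    FROM posts ORDER BY id LIMIT ? OFFSET ?""",
    (PAGE_SIZE, 3)).fetchall()   # second page of three rows

total = rows[0][2]               # window count ignores LIMIT/OFFSET
last_page = -(-total // PAGE_SIZE)  # ceiling division
print(total, last_page)          # 10 4
```

The caveat is that the window count still makes the engine visit every matching row, so for very large tables a cached or approximate count is often preferred.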
0 votes
3 answers
1k views

I have a table with millions of rows: CREATE TABLE [dbo].[RequestIdentities] ( [Id] [bigint] IDENTITY(1,1) NOT NULL, [UniqueKey] [nvarchar](256) NULL, [Timestamp] [datetime] NULL, ...
asked by George
0 votes
2 answers
72 views

Which is preferred for mapping values: reusing mapping functions, or building reference tables to do lookups? This is a very general high-level question which, I believe, is mostly language-independent....
asked by Jayden.Cameron
-1 votes
3 answers
115 views

I'm working on a project for university. Do you know what kind of algorithms I could implement that would help with the proper design and general performance of a database? Until now I came up ...
asked by Olteanu Radu
0 votes
1 answer
84 views

I have a query similar to: SELECT ANY_VALUE(name) AS `name`, 100 * SUM(score) / SUM(sum(score)) OVER (PARTITION BY date(scores.created_at)) AS `average_score`, ANY_VALUE(DATE_FORMAT(...
asked by Shiv
0 votes
1 answer
912 views

I have a select query over about 1M records; I'm working on a Magento 1.9 database. SELECT IF(sup_ap.is_percent = 1, TRUNCATE(mt.value + (mt.value * sup_ap.pricing_value / 100), 4), mt....
asked by HoangHieu
0 votes
1 answer
443 views

The production database at my company is running significantly slower than the test database (local ~5ms, test ~18ms, production ~1-2 sec). We've been trying to look into why and will be doing some ...
asked by datadumn
1 vote
1 answer
112 views

I have created a view by joining multiple tables to avoid the loading time of a dynamic query, but it gives the same result while taking more time. My system configuration is: 8GB RAM, i5 CPU, running ...
asked by Belgium Diamonds
0 votes
1 answer
501 views

I have a MySQL database with a table that has around 200000 rows. I am querying this table to fetch the latest data. Query: select * from `db`.`Data` where floor = "floor_value"...
asked by chink
3 votes
3 answers
342 views

Supposing I had a product database of some 50,000 products supplying data to a back end system and a website, some are live, some are archived and some are “switched off” as far as the website is ...
asked by Jamie Hartnoll
0 votes
1 answer
899 views

How can I filter queries by date to prevent massive sequential scan on a large database? My survey app collects responses and each answer to a question in a survey is stored in a table ...
asked by Jordan
0 votes
3 answers
2k views

I have a table (innodb) with 1 million new inserts (20GB) a week. I only need the data for 1 week, so I delete it after 7 days, so each day we delete around 3GB and insert 3GB new. That table is ...
asked by Developer Jano
5 votes
1 answer
836 views

I am studying databases from the book Fundamentals of Database Systems by Elmasri and Navathe, 5th edition, and they briefly explain external sorting using merge sort almost at the ...
asked by Ronald Becerra
0 votes
1 answer
828 views

I'm trying to identify some performance bottlenecks in my Postgres queries and ran an EXPLAIN ANALYZE on a query to get some insights. The output of the query analysis is below: Nested Loop (...
asked by TheMethod
-1 votes
1 answer
4k views

I have around 400 fields in my collection (including both at top level as well as embedded), following is the nature of write queries: All write queries always update single document and an average of ...
asked by Punit Goel
1 vote
1 answer
97 views

I'm using Entity Framework Core and reading data with thousands of records. Each record has many columns; of those columns I'm using 3 (a, b and c), doing: OrderBy(a).ThenBy(b).ThenBy(c); ...
asked by Erre Efe
0 votes
1 answer
872 views

I have a table with 320 million+ rows and 34 columns, all of varchar(max) datatype, and with no indexing. I am finding it extremely time consuming to summarize the whole table. Can anyone suggest ...
asked by Shahbaz Khan
-1 votes
1 answer
144 views

Our database has a very large table with several million rows of data. Some old code was written from a naive standpoint by myself several years ago, and doesn't account for poor database performance. ...
asked by Scuba Steve
-1 votes
1 answer
120 views

I have statements like this that are timing out: SELECT COUNT(*) FROM A WHERE A.value1 IN ( SELECT A.value1 FROM A WHERE A.value2 = 0 ) Table A has 13,000,000+ rows in it, and because of some ...
asked by world hello
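Self-referencing IN subqueries like the one above can often be rewritten as a correlated EXISTS (or a join on a pre-aggregated subquery), which some planners handle much better. A sketch in SQLite via Python with invented toy data, showing that the two forms agree:

```python
import sqlite3

# Sketch: rewriting a self-referencing IN as a correlated EXISTS.
# Table A mirrors the question's shape; the rows are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE A (value1 INTEGER, value2 INTEGER)")
conn.executemany("INSERT INTO A VALUES (?, ?)",
                 [(1, 0), (1, 5), (2, 3), (3, 0), (4, 7)])
conn.execute("CREATE INDEX idx_a ON A (value2, value1)")

# Original form: IN over a subquery on the same table.
in_count = conn.execute("""
    SELECT COUNT(*) FROM A
    WHERE A.value1 IN (SELECT value1 FROM A WHERE value2 = 0)""").fetchone()[0]

# Rewritten form: correlated EXISTS, which can probe the index per row.
exists_count = conn.execute("""
    SELECT COUNT(*) FROM A
    WHERE EXISTS (SELECT 1 FROM A AS B
                  WHERE B.value2 = 0 AND B.value1 = A.value1)""").fetchone()[0]

print(in_count, exists_count)  # both count rows whose value1 occurs with value2 = 0
```

Whether EXISTS actually beats IN depends on the engine and version; comparing the two plans with EXPLAIN on the real 13M-row table is the only reliable check.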
-2 votes
2 answers
48 views

We are using MySQL InnoDB. We have a query that looks like this. In our live environment, this query took more than 30 seconds to complete. select count(*) as aggregate from `parents` where ...
asked by Gab
0 votes
0 answers
137 views

We need a database profiler for our application that shows the full detail of each query and suggests indexes, etc. Is there any free tool available, like Postgres Enterprise Manager?
asked by Asnad Atta
0 votes
2 answers
224 views

Hi, I am trying to optimize this query. If there are a lot of transactions in the timeframe, it can take up to 10 seconds to execute in my local environment. I tried to create an index on the created_at ...
asked by Julian
