# Cursor-Based Relay-Style Pagination in GraphQL-ruby
Cursor-based pagination is a more efficient alternative to offset-based pagination, especially for large datasets. GraphQL-ruby supports Relay-style cursor-based pagination, which provides a standardized way to paginate through data.
## Why Cursor-Based Pagination?

Cursor-based pagination offers several advantages:

- Performance: avoids the growing cost of scanning and discarding `OFFSET` rows
- Consistency: handles concurrent inserts and deletes without skipping or duplicating items
- Scalability: query cost stays roughly constant no matter how deep you paginate
- Standard: follows the Relay connection specification
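The consistency point deserves a concrete sketch. Below, plain Ruby (no database) simulates a client requesting page 2 after a new post arrives: the offset request re-serves an item from page 1, while a keyset cursor does not.

```ruby
# Plain-Ruby sketch (no database): post ids, newest first.
posts = [5, 4, 3, 2, 1]

page1 = posts[0, 3]            # OFFSET 0 LIMIT 3 => [5, 4, 3]
posts = [6] + posts            # a new post is created between requests

# Offset pagination: OFFSET 3 re-serves post 3, already seen on page 1.
offset_page2 = posts[3, 3]     # => [3, 2, 1]

# Cursor pagination: "everything after post 3" is unaffected by the insert.
cursor_page2 = posts.select { |id| id < page1.last }.first(3)  # => [2, 1]
```

The same reasoning carries over to the `(created_at, id)` tuples used by the resolvers below.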
## Setting Up Connection Types

### Enable Connections

GraphQL-ruby provides connection types out of the box:

```ruby
# app/graphql/types/base_object.rb
module Types
  class BaseObject < GraphQL::Schema::Object
    # Connections are enabled by default
  end
end
```
### Define Your Type

Create your base object type:

```ruby
# app/graphql/types/post_type.rb
module Types
  class PostType < Types::BaseObject
    field :id, ID, null: false
    field :title, String, null: false
    field :content, String, null: false
    field :created_at, GraphQL::Types::ISO8601DateTime, null: false
  end
end
```
## Implementing Pagination

### Basic Connection

Add a connection field to your query type:

```ruby
# app/graphql/types/query_type.rb
module Types
  class QueryType < Types::BaseObject
    field :posts, Types::PostType.connection_type, null: false
  end
end
```
### Custom Resolver

To control the cursor format yourself, implement the keyset logic in the resolver:

```ruby
# app/graphql/types/query_type.rb
module Types
  class QueryType < Types::BaseObject
    field :posts, Types::PostType.connection_type, null: false do
      argument :after, String, required: false
      argument :first, Integer, required: false, default_value: 20
    end

    def posts(after: nil, first: 20)
      relation = Post.order(created_at: :desc, id: :desc)
      if after && (cursor = decode_cursor(after))
        # Row-value comparison: rows strictly "before" the cursor position,
        # breaking created_at ties by id (supported by PostgreSQL and MySQL)
        relation = relation.where(
          '(created_at, id) < (?, ?)',
          cursor[:created_at],
          cursor[:id]
        )
      end
      relation.limit(first)
    end

    private

    def decode_cursor(cursor)
      decoded = Base64.urlsafe_decode64(cursor)
      # rpartition: the ISO8601 timestamp itself contains colons,
      # so split on the *last* colon only
      timestamp, _, id = decoded.rpartition(':')
      {
        created_at: Time.parse(timestamp),
        id: id.to_i
      }
    rescue ArgumentError
      nil
    end
  end
end
```

Note that graphql-ruby's built-in connection wrapper also applies `first`/`after` with its own opaque cursors; to have custom cursors like these used end to end, the same keyset logic is typically packaged in a `GraphQL::Pagination::Connection` subclass instead.
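The SQL row-value comparison `(created_at, id) < (?, ?)` compares tuples lexicographically. Ruby's `Array#<=>` has the same semantics, which makes the tie-breaking behavior easy to see (a standalone sketch, not part of the resolver):

```ruby
require 'time'

t = Time.parse('2020-01-01T00:00:00Z')

# Same timestamp: the id breaks the tie, so post 122 sorts before post 123.
tie_broken_by_id = ([t, 122] <=> [t, 123]) == -1

# Different timestamps: only created_at decides; the id is irrelevant.
decided_by_time = ([t - 60, 999] <=> [t, 1]) == -1
```

This is exactly why the composite cursor stays stable even when many posts share one `created_at` value.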
## Using Connection Types

### Query Structure

Query using connection fields:

```graphql
query GetPosts($first: Int, $after: String) {
  posts(first: $first, after: $after) {
    edges {
      node {
        id
        title
        content
        createdAt
      }
      cursor
    }
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
  }
}
```
### Query Variables

The `after` value is the `cursor` returned for the last edge of the previous page; with the encoding used in this article it is the Base64 of `"<iso8601 timestamp>:<id>"`:

```json
{
  "first": 10,
  "after": "MjAyMC0wMS0wMVQwMDowMDowMFo6MTIz"
}
```
## Cursor Encoding

### Encode Cursors

Create a helper to encode and decode cursors:

```ruby
# app/graphql/concerns/cursorable.rb
module Cursorable
  extend ActiveSupport::Concern

  def encode_cursor(record)
    timestamp = record.created_at.iso8601
    Base64.urlsafe_encode64("#{timestamp}:#{record.id}")
  end

  def decode_cursor(cursor)
    return nil unless cursor

    decoded = Base64.urlsafe_decode64(cursor)
    # rpartition: the ISO8601 timestamp itself contains colons
    timestamp, _, id = decoded.rpartition(':')
    {
      created_at: Time.parse(timestamp),
      id: id.to_i
    }
  rescue ArgumentError
    nil
  end
end
```
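A quick round-trip check of this format, written without ActiveSupport so it runs standalone:

```ruby
require 'base64'
require 'time'

# Stand-in for an ActiveRecord model, just for the round trip.
Post = Struct.new(:id, :created_at)

def encode_cursor(record)
  Base64.urlsafe_encode64("#{record.created_at.iso8601}:#{record.id}")
end

def decode_cursor(cursor)
  # rpartition, not split: "2020-01-01T00:00:00Z:123" contains
  # colons inside the timestamp, so only the last one is a separator.
  timestamp, _, id = Base64.urlsafe_decode64(cursor).rpartition(':')
  { created_at: Time.parse(timestamp), id: id.to_i }
end

post    = Post.new(123, Time.utc(2020, 1, 1))
cursor  = encode_cursor(post)
decoded = decode_cursor(cursor)
# decoded[:id] == 123, decoded[:created_at] == post.created_at
```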
## Advanced Implementation

### Custom Connection Class

Create a custom connection class for more control, for example to expose a total count:

```ruby
# app/graphql/types/post_connection.rb
module Types
  class PostConnection < Types::BaseConnection
    field :total_count, Integer, null: false

    def total_count
      # object is the connection wrapper; object.items is the full
      # unpaginated collection (object.nodes is only the current page)
      object.items.size
    end
  end
end
```

```ruby
# Use in the query type
field :posts, Types::PostConnection, null: false
```
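With the custom connection in place, `totalCount` (graphql-ruby camelizes the Ruby field name by default) is queryable alongside the standard connection fields:

```graphql
query {
  posts(first: 10) {
    totalCount
    edges {
      node {
        id
        title
      }
    }
  }
}
```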
### Filtering and Sorting

Add filter arguments alongside the pagination arguments:

```ruby
field :posts, Types::PostType.connection_type, null: false do
  argument :after, String, required: false
  argument :first, Integer, required: false, default_value: 20
  argument :status, String, required: false
end

def posts(after: nil, first: 20, status: nil)
  relation = Post.order(created_at: :desc, id: :desc)
  relation = relation.where(status: status) if status
  if after && (cursor = decode_cursor(after))
    relation = relation.where(
      '(created_at, id) < (?, ?)',
      cursor[:created_at],
      cursor[:id]
    )
  end
  relation.limit(first)
end
```
## Real-World Example

### Complete Implementation

```ruby
# app/graphql/types/query_type.rb
module Types
  class QueryType < Types::BaseObject
    include Cursorable

    field :posts, Types::PostType.connection_type, null: false do
      argument :after, String, required: false
      argument :first, Integer, required: false, default_value: 20
      argument :category, String, required: false
    end

    def posts(after: nil, first: 20, category: nil)
      relation = Post.order(created_at: :desc, id: :desc)
      relation = relation.joins(:category).where(categories: { name: category }) if category
      if after && (cursor = decode_cursor(after))
        # Table-qualified columns avoid ambiguity after the join
        relation = relation.where(
          '(posts.created_at, posts.id) < (?, ?)',
          cursor[:created_at],
          cursor[:id]
        )
      end
      relation.limit(first)
    end
  end
end
```
### Query Example

```graphql
query GetPosts($first: Int, $after: String, $category: String) {
  posts(first: $first, after: $after, category: $category) {
    edges {
      node {
        id
        title
        content
        createdAt
      }
      cursor
    }
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
  }
}
```
## Benefits Over Offset Pagination
| Feature | Cursor-Based | Offset-Based |
|---|---|---|
| Performance | Better with large datasets | Degrades with large offsets |
| Consistency | Handles data changes | Can skip/duplicate items |
| Scalability | Excellent | Poor at high offsets |
| Implementation | More complex | Simpler |
## Best Practices

- Use composite cursors: include multiple fields (timestamp + id) for stability
- Index properly: ensure database indexes on cursor fields
- Limit page size: keep `first` reasonable (10-50 items)
- Handle edge cases: deal with invalid cursors gracefully
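For the page-size limit, a small clamp keeps hostile or accidental `first: 100000` requests in check (a sketch; the cap and default values are arbitrary choices):

```ruby
MAX_PAGE_SIZE     = 50
DEFAULT_PAGE_SIZE = 20

def clamp_page_size(first)
  first = first.to_i
  return DEFAULT_PAGE_SIZE if first <= 0  # nil, zero, or negative: use the default
  [first, MAX_PAGE_SIZE].min              # never exceed the cap
end
```

graphql-ruby's built-in connections offer the same protection declaratively via `max_page_size` on the field or schema.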
## Database Indexing

Ensure proper indexes for performance; the composite index should match the cursor's sort order:

```ruby
# db/migrate/xxx_add_indexes_to_posts.rb
add_index :posts, [:created_at, :id], order: { created_at: :desc, id: :desc }
```
## Error Handling

Rather than silently returning `nil`, surface invalid cursors as a GraphQL error:

```ruby
def decode_cursor(cursor)
  decoded = Base64.urlsafe_decode64(cursor)
  timestamp, _, id = decoded.rpartition(':')
  {
    created_at: Time.parse(timestamp),
    id: id.to_i
  }
rescue ArgumentError, TypeError
  raise GraphQL::ExecutionError.new(
    "Invalid cursor: #{cursor}",
    extensions: { code: "INVALID_CURSOR" }
  )
end
```
## Conclusion
Cursor-based Relay-style pagination in GraphQL-ruby provides an efficient, scalable way to paginate through data. By using connection types and properly encoding/decoding cursors, you can implement pagination that performs well even with large datasets and handles data changes gracefully.