Funny, I was just reviewing the react-admin repo this morning. We have a large internal tool that has been in beta for a couple of years, built with SvelteKit, and we are planning a full rewrite for the final version. As part of this, we will also be splitting the field data capture portion out into a native mobile app. We are exploring the possibility of using React (vanilla with React Router, TanStack, or Next.js), as React Native is on the list for the native app. One thing that concerns me is the snappiness of the front end. In our current implementation, with mostly client-side database access (supabase-js), the app is extremely fast. I know from experience that React applications, especially in Next.js, are noticeably slower. What can you tell me honestly about your experience with react-admin in regard to page load/hydration speeds?
https://preview.redd.it/e3bl5c9b0pmg1.jpeg?width=1179&format=pjpg&auto=webp&s=c04211bb120d846fc91efac7ed501c0677177ea8 If there is an issue, I don’t think it’s just the East; I have West Coast users reporting slow uploads to buckets. This is a 10 MB video that would normally take a few seconds tops. Been like that all morning.
self literally = on your own. Just use Coolify or some other one-click deployment. Could the documentation be better? Absolutely. Are they obligated or motivated to do that? No.
Very cool! If you scale definitely look into broadcast from db. https://supabase.com/blog/realtime-broadcast-from-database
Solid, leaning on Supabase's built in auth context to automatically secure and manage your serverless endpoints without custom middleware is an efficient architecture for a solo developer. Since you abandoned streaming entirely to wait for structured JSON, aren't you concerned that the combined AI response delay and those Edge Function cold starts will feel unresponsive and frustrate mobile users? (Insert world class captivating animations here.) To bypass React Native's notorious streaming limitations and deliver that realtime typing effect to your users, you can import web-streams-polyfill in your entry file and explicitly pass { reactNative: { textStreaming: true } } in your fetch options. It’s easy and just works. Unless that server processing and structured JSON is not something you can get away from.
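For reference, a minimal sketch of the polyfill setup mentioned above, plus a small decoding helper for the read loop. Assumptions: `web-streams-polyfill` is installed, and (in many setups) a fetch polyfill such as `react-native-fetch-api` is what actually understands the `reactNative: { textStreaming: true }` option; the endpoint URL is hypothetical.

```typescript
// Sketch of enabling streamed responses in React Native. The fetch part is
// shown in comments because it only runs inside an RN app with the polyfills:
//
//   // entry file (e.g. index.js):
//   //   import "web-streams-polyfill";   // provides ReadableStream
//   //
//   // const res = await fetch("https://example.com/ai/stream", {
//   //   reactNative: { textStreaming: true },  // non-standard polyfill option
//   // });
//   // const reader = res.body.getReader();

// Helper for the read loop: decodes raw byte chunks into text without
// breaking multi-byte UTF-8 characters that span chunk boundaries.
function decodeChunks(chunks: Uint8Array[]): string {
  const decoder = new TextDecoder("utf-8");
  let text = "";
  for (const chunk of chunks) {
    text += decoder.decode(chunk, { stream: true }); // buffers partial code points
  }
  return text + decoder.decode(); // flush any trailing bytes
}
```

Each decoded piece is what you append to state for the typing effect; the `stream: true` flag is what keeps a split emoji or accented character from rendering as garbage mid-stream.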
It’s definitely a unique solution and Patreon is well known. If it fits with your model, maybe it’s the niche that gets you a foothold where a typical subscription payment setup would not! Props for thinking outside the box. That said, when you’re ready for Stripe, I know it can be a bit overwhelming because they offer so much, but their documentation is best in class when it comes to payment platforms IMO. Just dig in, and use the extension as well. Oh, and don’t forget that the Stripe Supabase integration can give you a big head start. And obviously, Stripe is not the only option out there; the fintech scene is growing at an exponential pace. Edit: fixed a bunch of voice-to-text garbage
One does not simply ”interact with Supabase” without reading the manual, at least the first page or two. ;)
Duct tape and WD-40 get the job done
Custom claims via the JWT `custom_access_token_hook`. Google that and read up.
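To illustrate the flow: the hook is a Postgres function that Supabase Auth calls when minting the access token, letting you merge extra claims into the JWT. Once a claim is in the token, reading it client-side is just decoding the payload. A minimal sketch, assuming a hypothetical `user_role` claim your hook added (not a Supabase default):

```typescript
// Hypothetical: assumes your custom_access_token_hook (a Postgres function
// registered in the Auth hooks config) added a `user_role` claim to the
// access token. This decodes the JWT payload locally; it does NOT verify
// the signature, so use it for UI hints only, never for authorization.
function getClaim(accessToken: string, claim: string): unknown {
  const payloadB64url = accessToken.split(".")[1]; // JWT = header.payload.signature
  // base64url -> base64, then decode and parse
  const b64 = payloadB64url.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(atob(b64))[claim];
}
```

Server-side (RLS policies, Edge Functions) you would read the same claim from the verified JWT instead.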
You can absolutely do this in local development using a second local instance and a testing data seed.sql file. You can spin it up without interfering with your current local CLI instance by following this Gist - [https://gist.github.com/ThingEngineer/27de580744b73a382d1832fde5423a56](https://gist.github.com/ThingEngineer/27de580744b73a382d1832fde5423a56)

You could also just do something like this if you are OK with stopping and starting your one instance:

```bash
#!/bin/bash
# test.sh (chmod +x)
set -e
set -o pipefail

# Back up the current local data as a restorable seed file
supabase db dump --local --data-only --schema auth,storage,public \
  --exclude storage.buckets \
  --file supabase/local_bk_seed.sql

# Swap in the testing seed and reset so it gets applied
cp supabase/testing_bk_seed.sql supabase/seed.sql
supabase db reset

pnpm test:run # (or whatever)

# These will only run if all preceding commands succeeded (set -e)
cp supabase/local_bk_seed.sql supabase/seed.sql
supabase db reset
echo "all good bro"
```

You can do something similar in your GitHub Actions CI with branching, or with a free Supabase project if your project can fit on one.

Edit: sorry, sounds like you are already doing that.
Second that. I do wish the posts were dated (created/updated)!!! https://supabase.com/docs/guides/getting-started/quickstarts/sveltekit Edit: I know documentation is time consuming and docs are not always up to date, that’s why this is so important.
Curious, do any of the Statuspage statuses relating to your upstream partners get updated automatically via integrations? We’ve been looking at automating ours like that, but there is a lot to consider.
That said, implementing the requirements to ensure your service’s availability, processing integrity, confidentiality, and privacy might not afford you the option of staying in the cheap/low-price bracket. You’ll also have to provide integrations for third-party clients using your service who want to be SOC 2 compliant, since their own audits will require it. The initial/yearly cost of your own audit is just the tip of the iceberg when you’re providing a service like this. It does help you appreciate the costs that providers have to pass along for a service like this. SB Teams comes with a lot of other perks too.
And the latter is typically the case, given that most applications integrate with other third parties for the highly sensitive things, such as storing financial data (account numbers that are tokenized, for example). That, and there aren’t too many critical infrastructure providers asking questions on here. But you are correct: once you talk to your certification partners about your platform, the type of data you are storing, your work processes, etc., they are quickly able to provide you with a price that matches the scope of your company and application.
Vanta is really not bad. They will work with you on pricing for your readiness and audit (package deal). Source: currently a Vanta customer
You’re right that the token is exposed in the HTML. This is a deliberate trade-off Supabase makes to prioritize the Public Client model. Answering your questions as a whole rather than each one individually...

Why: Supabase is designed so the browser can talk directly to your database and Realtime engine. For that to work, the client-side JavaScript needs the token immediately. If you used strictly httpOnly cookies, the JavaScript would be "blind" and you’d have to proxy every single request through your own server.

Why it is (mostly) safe: Supabase uses Refresh Token Rotation. If an attacker steals that token from your HTML and uses it, the moment the legitimate user’s browser tries to use that same token, Supabase detects the "reuse" and immediately kills the entire session for everyone. This prevents long-term account hijacking.

How to "fix" it if you want higher security:

1. Change your load function: in `+layout.server.ts`, return only the `user` object (or just the ID) instead of the full session.
2. Toggle cookies: set your auth cookies to `httpOnly: true` in your Supabase configuration.
3. The catch: you will no longer be able to use `supabase.from()...` inside your `.svelte` components. You will have to do 100% of your data fetching on the server in `+page.server.ts`, which is the normal pattern anyway if you are not using the client-side Supabase API. In that case you would also want to disable the public Data API in your Supabase project.

Clear as mud?
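A minimal sketch of step 1, using a hypothetical helper (`toClientSafe` is my name, not a Supabase API) that strips the tokens before the load function returns, so only the user object gets serialized into the HTML:

```typescript
// Hypothetical helper for step 1: return only the user from the session so
// the access/refresh tokens are never serialized into the rendered page.
interface SessionLike {
  access_token?: string;
  refresh_token?: string;
  user: { id: string; email?: string };
}

function toClientSafe(session: SessionLike | null): { user: SessionLike["user"] | null } {
  if (!session) return { user: null };
  return { user: session.user }; // tokens deliberately dropped
}

// In +layout.server.ts (SvelteKit), assuming the usual locals helper:
//   export const load = async ({ locals }) => toClientSafe(await locals.getSession());
```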
I always use them for Edge and Postgres functions and have done so in production for years now. It’s just pgsodium, which is used in production for modern, high-performance cryptography in PostgreSQL. It in turn uses libsodium, which is 13 years old. One thing to note is that pgsodium is pending deprecation and Vault will be shifting away from it, but the Vault API will remain stable. Wrappers on wrappers on wrappers, like everything else. https://supabase.com/docs/guides/database/extensions/pgsodium What are you using to protect your secrets now?

Edit: as far as limitations, there are some, but I don’t know if I would call them limitations so much as situational functionality. Obviously it’s for at-rest encryption of secrets within the database, and does not provide true end-to-end client-side encryption. There were INSERT statement logging issues, but I believe those have been fixed.

Actually, I can think of one limitation, but it only applies to local development and it’s more of an annoyance really. When you add your secrets to the config.toml for local CLI development and testing, each time you push your db migrations (`supabase db push`), a lowercase version of those secrets is pushed to your production Vault. There is no option to disable this. Why the heck would you push development secrets to production anyway?!!! I’m sure this will eventually get fixed, but it still does it today. I guess I’m just glad that it doesn’t push the uppercase version, because that would overwrite the production secret.
No.
Yes please!!
You're very welcome! Honestly I like this better (edit: I had thought you wanted to stay in the Sb environment) and I share your sentiments about full control in a custom API vs mixed micro-features in a (very cool) but monolithic environment like Supabase. A lot of times it makes so much more sense to do the easy, medium, or hard thing on your own terms in a flow that you feel comfortable with. Supabase has come a long way in the last few years and they are making some big strides forward; however, there is no doubt about the presence of growing pains. Best of luck on your project(s)!
Sure, and try this out. [https://micro-mes.lovable.app](https://micro-mes.lovable.app) The stations and TV are simulated in separate cards. I added a few more stations, but that's easy to configure; you could even make it configurable dynamically to allow for growth, on-the-fly changes, or one-off process setups. Edit: This took about 20 minutes. Had to take the trash out, feed the cat, and got some ice cream before starting after my first post. Edit 2: I connected this to a Supabase Pro project but decided to leave it using mock data for now, as I would rather look over the RLS myself despite it passing Lovable's security scan and not using the public schema. If you would like the PRD and prompt I built to feed into Lovable, it's yours. I won't be able to send it on here though, it's too large.
Short answer: Totally doable, but keep the workflow/state logic and heavy calculations in Supabase/Postgres, and use Lovable to generate the UI and simple client-side glue. That gives you correctness and performance, and avoids “AI spaghetti”. Let me know if you want the long answer.
It was still too big for a post, so here is a Pastebin: [https://pastebin.com/1e8LvFy1](https://pastebin.com/1e8LvFy1)
The "Async" flag pattern is fine if you absolutely must write directly to your DB from the app (you don't want to refactor to call an Edge Function first). Using RLS to block access to non-sanitized rows:

    CREATE POLICY "Only show sanitized" ON reviews
      FOR SELECT USING (is_sanitized = true);

But IMO I still think the proxy pattern is the most robust and safest, because it moves the CPU-intensive work and latency completely off your database. The database never sees dirty data, so you don't need RLS policies to hide it.

- DO NOT create an INSERT policy for the 'authenticated' or 'anon' roles. This prevents the client (normal users) or a bad actor from posting directly to the reviews table. They'll just get a "new row violates row-level security policy" error if they try to insert directly.
- The client sends the dirty HTML to a Supabase Edge Function, like 'submit-review'.
- The Edge Function:
  - Handles the CORS preflight request (required if calling from a browser)
  - Instantiates an auth client (scoped to the user) by passing the Authorization header from the request, then calls getUser() to verify the token is valid and get the user's ID
  - Parses the request body to get the dirty HTML
  - Runs a Deno-compatible sanitization library (sanitize-html, ammonia, etc.; note that isomorphic-dompurify won't work in Deno)
  - Instantiates a second client using the Service Role Key (bypasses RLS)
  - Inserts the clean data using user.id from the auth check (NOT from the request body, to prevent spoofing)
- The database receives purely clean data. No triggers needed, no DB CPU usage, no other points of failure.

I'll post a real example function doing this for blog posts; it's probably too long to include in this post.
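One hedged side note on the sanitization step: if your reviews don't actually need to preserve any HTML formatting, you can skip a sanitizer library entirely and just escape the markup in the Edge Function before the service-role insert. A minimal sketch (this is escaping, not sanitization, so all formatting is lost):

```typescript
// Minimal escaping helper: neutralizes ALL markup rather than allowlisting
// safe tags the way sanitize-html does. Only appropriate when the stored
// text will be rendered as plain text.
function escapeHtml(dirty: string): string {
  const map: Record<string, string> = {
    "&": "&amp;",
    "<": "&lt;",
    ">": "&gt;",
    '"': "&quot;",
    "'": "&#39;",
  };
  return dirty.replace(/[&<>"']/g, (ch) => map[ch]);
}
```

If users are supposed to keep bold/links/lists, stick with a real sanitizer as described above; escaping would turn their formatting into visible angle-bracket soup.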
Wish there were a Postgres extension for this exact scenario: blocking the record from being read until the non-blocking sanitization is finished.
Yes, always poll unless realtime is critically required. (Edit: or self host)
With that I would suggest finding a vendor that does what you need that is already GDPR compliant. Maybe Supabase will make that available at some point.
So, would it be easy to make Supabase GDPR compliant if you self-hosted it? What is the list of necessary changes/additions, and how would you implement these missing requirements?
Nice, congrats on the launch! Did you use Expo, npx create-expo-app, the original [create-t3-turbo](https://github.com/t3-oss/create-t3-turbo) repo, or something else? It's been a while since I've used React Native, and I have not used Expo before but am considering it.
You’ve got one already and it is easy to set up: pgmq. I’ve used it in a situation similar to the one you described, and in an email queue system by adding pg_cron to the mix. https://supabase.com/docs/guides/queues
This. Edge Functions have their limitations. Aces in places.