
Using ChatGPT and AI assistants over the past year, here are my best use cases:

- Generating wrappers and simple CRUD APIs on top of database tables, provided only with a DDL of the tables.

- Optimizing SQL queries and schemas, especially for less familiar SQL dialects—extremely effective.

- Generating Swagger comments for API methods. A joy.

- Re-creating classes or components based on similar classes, especially with Next.js, where the component mechanics often make this necessary.

- Creating utility methods for data conversion or mapping between different formats or structures.

- Assisting with CSS and the intricacies of HTML for styling.

- GPT4 o1 is significantly better at handling more complex creation and refactoring scenarios.

Current challenges based on my experience:

- LLMs lack critical thinking; they tend to accommodate the user's input even if the question is flawed or has no valid answer.

- There's a substantial lack of context in most cases. LLMs should integrate more deeply with data sampling capabilities or, ideally, support real-time debugging context.

- Challenging to use in large projects due to limited awareness of project structure and dependencies.


Hey Hey,

We're curious about your thoughts on Snowflake and the idea of an open-source alternative. Developing such a solution would require significant resources, but there might be an existing in-house project somewhere that could be open-sourced, who knows.

Could you spare a few minutes to fill out a short 10-question survey and share your experiences and insights about Snowflake? As a thank you, we have a few $50 Amazon gift cards that we will randomly share with those who complete the survey.

Thanks in advance


As for next steps: we plan to test multimodal ChatGPT with image data, perhaps passing a full screenshot of a dashboard with its different charts, to improve the model's contextual understanding. Since the main constraint when working with raw data is prompt length, data presented in a visual format may be far more condensed and compact. A rough sketch of such a call is below.
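Here is a minimal sketch of how a screenshot could be passed to a vision-capable model via the OpenAI Python SDK; the model name, file path, and prompt are illustrative assumptions rather than our exact setup.

  # Hedged sketch: send a dashboard screenshot to a vision-capable model.
  # Model name, file path, and prompt are assumptions for illustration only.
  import base64
  from openai import OpenAI

  client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

  with open("dashboard.png", "rb") as f:
      image_b64 = base64.b64encode(f.read()).decode("utf-8")

  response = client.chat.completions.create(
      model="gpt-4o",  # any vision-capable model
      messages=[{
          "role": "user",
          "content": [
              {"type": "text",
               "text": "Summarize the trends shown in these dashboard charts."},
              {"type": "image_url",
               "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
          ],
      }],
  )
  print(response.choices[0].message.content)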


The article is about the importance of using the right technology for your workloads and thinking one to two years ahead for your project.


Why are you using 'threads' instead of vCPUs or AWS instance types, as in the other benchmarks? That makes it really hard to compare and raises suspicions here.


It is related to the "max_threads" setting of ClickHouse; by default it equals the number of physical CPU cores, which is half the number of vCPUs.

For example, the c6a.4xlarge instance type in AWS has 16 vCPUs and 8 physical cores, so "max_threads" in ClickHouse will be 8.
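A minimal sketch for verifying this on a running instance with the clickhouse-driver Python package (host and credentials are placeholders, not a specific deployment):

  # Minimal sketch: compare ClickHouse's max_threads with the host's vCPU count.
  # Host and credentials are placeholders; adjust for your own deployment.
  import os
  from clickhouse_driver import Client

  client = Client(host="localhost")

  rows = client.execute(
      "SELECT value FROM system.settings WHERE name = 'max_threads'"
  )
  print("max_threads:", rows[0][0])              # physical cores by default
  print("vCPUs seen by the OS:", os.cpu_count()) # typically 2x physical cores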


We built a managed ClickHouse service to address exactly these difficulties with the technology. We handle sharding, clustering, ZooKeeper, patching, updates without downtime, and hybrid storage based on S3. https://double.cloud


How many ClickHouse-as-a-service offerings exist now? I stopped counting at 7 a few months ago (double.cloud was not on my list).


Could you share a list of them?


  - Firebolt (hard fork of ClickHouse)
  - Altinity
  - Gigapipe
  - Hydrolix
  - Bytehouse.cloud
  - https://clickhouse.com/ ("coming soon")
  - TiDB (their columnstore is a fork of ClickHouse)
I stopped tracking after this. I also saw a few press releases announcing a few others, but I've lost track of them now.

The official ClickHouse Inc. is surely going to be under pressure to pull features out of their open-source offering over time in order to differentiate themselves.


Great question! 1. We tried to build a platform covering end-to-end analytics scenarios, with core services based on open source, and Kafka is an essential building block in such solutions. 2. Running them on the same platform eliminates things like traffic between accounts and between AZs, and addresses security concerns. 3. We are also adding things that integrate each of these blocks and help users start using open-source technologies faster.


CDC is our secret and currently free feature :)


Hi everyone, I'm Victor, Product Lead at https://double.cloud. If you have any questions, feel free to fire them at me.


Thanks guys for making ClickHouse available as a managed service


Sure, we built it with "bells and whistles": tools for visualization with native support for ClickHouse, a lightweight ETL service, and many other things like backups, monitoring, and logs. We even support ClickHouse over S3, where data can be decoupled from compute onto S3, giving the performance of ClickHouse at the cost of S3 storage.


That sounds similar to DynamoDB and CosmosDB. I believe the workaround is to page on the client side to fetch all results.
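For illustration, a client-side paging loop over DynamoDB with boto3 might look like the sketch below; the table name is hypothetical and boto3 needs AWS credentials configured.

  # Hedged sketch: client-side paging over DynamoDB results via LastEvaluatedKey.
  # Table name is hypothetical; the same pattern applies to query() as well.
  import boto3

  dynamodb = boto3.client("dynamodb")

  items = []
  kwargs = {"TableName": "Orders"}  # hypothetical table
  while True:
      page = dynamodb.scan(**kwargs)
      items.extend(page["Items"])
      last_key = page.get("LastEvaluatedKey")
      if not last_key:
          break
      kwargs["ExclusiveStartKey"] = last_key

  print(f"Fetched {len(items)} items across all pages")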

