We’re serious about security, privacy, and ethical data use

Not used to train AI models

Data uploaded to WhyHive won’t be used to train third-party AI models like GPT. You control when and if your data is shared with others, and you can delete your data at any time.

Encrypted by default

We use AES-256 to encrypt data while it’s on our servers. This is the same encryption standard used in many industries that handle sensitive information, like healthcare and banking.
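
As a rough illustration of what AES-256 encryption at rest involves, the snippet below uses the widely used Python cryptography library. It is a generic sketch, not WhyHive’s actual implementation, and the key handling is simplified for brevity.

```python
# Generic illustration only: AES-256-GCM encryption of data at rest using the
# Python "cryptography" library. Real systems keep keys in a key-management
# service; the key handling here is simplified for brevity.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key = AES-256
aesgcm = AESGCM(key)

plaintext = b"respondent_id,feedback\n1042,Great service"
nonce = os.urandom(12)                      # a unique nonce for each record
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Without the key the ciphertext is unreadable; with it, the data round-trips.
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```

Only someone holding the key can read the stored data, which is why this kind of encrypted storage is standard in industries like healthcare and banking.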

Data stored in Australia

We store user and project data on secure AWS servers located in Australia. AWS is the same cloud infrastructure used by companies like Canva, Netflix and Meta.

FAQ

Data storage & access

  • WhyHive stores user data, project data and uploaded data on AWS cloud servers located in Australia. This data is encrypted while it is stored. AWS is used by leading software businesses like Canva, Netflix and Meta. You can read more about AWS security and compliance here.

    Data that is analysed by AI is stored temporarily for 30 days in Australia by Microsoft for abuse detection purposes.

  • No. The WhyHive app interfaces with GPT models via Microsoft’s Azure OpenAI Service (more about this in the ‘AI privacy & security’ section below). These are AI models developed by OpenAI but hosted on Microsoft’s cloud infrastructure.

    To detect and mitigate abuse, Microsoft stores all prompts and generated content for up to thirty (30) days on secure Microsoft services located in Australia.

  • Your projects and data are private to you. Nobody, not even someone with a project link, can access a project or see its underlying data unless they are logged in with the account that was used to create it.

    If another WhyHive user attempts to open a project that does not belong to them, they will not be able to see or access any project information. This includes charts, queries, preferences and the underlying data.

    Note that certain members of the WhyHive development team have access to user-uploaded data for troubleshooting and debugging purposes.
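
For readers who want the concrete detail, the sketch below illustrates the kind of ownership check described in this section. The class, field and function names are illustrative assumptions only, not WhyHive’s actual code.

```python
# Hypothetical sketch of a project-ownership check like the one described
# above. Class names and fields are illustrative assumptions, not WhyHive code.
from dataclasses import dataclass, field

@dataclass
class Project:
    id: str
    owner_id: str                     # the account that created the project
    charts: list = field(default_factory=list)
    data_rows: list = field(default_factory=list)

class AccessDenied(Exception):
    """Raised when a logged-in user tries to open a project they do not own."""

def open_project(project: Project, requesting_account_id: str) -> Project:
    # Having the project link is not enough: the request must come from the
    # account that created the project.
    if project.owner_id != requesting_account_id:
        raise AccessDenied("This project belongs to a different account.")
    return project
```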

AI privacy & security

  • WhyHive uses AI for a number of purposes including:

    • Cleaning and processing data.

    • Recommending questions you can ask about your data.

    • Interpreting questions you ask about your data.

    • Text analysis.

    WhyHive does not use AI to make charts or populate them. As such, there is no risk of AI making up new data that is fake or incorrect (known as ‘hallucination’).

    The only time AI generates new data is when you use the text analysis feature, where you control when AI is used to derive and assign themes. Data created in this way is clearly labelled and kept separate from the original data.

  • WhyHive does not use AI to visualise your data. This means that when you see a chart in WhyHive, the underlying data can only be real data in your dataset. In other words, there is no risk of AI making up new data that is fake or incorrect (known as ‘hallucination’) and then showing this in a chart without you knowing.

    This is different to ChatGPT and similar tools, which give AI responsibility for the entire data analysis process. As a result, these tools can perform analysis that is unintended and/or obscured from the user, leading to inaccurate or misleading results.

  • No, data that you upload is not used to train AI models. The WhyHive app interfaces with GPT models via the Azure OpenAI Service. These are AI models developed by OpenAI but hosted on Microsoft’s cloud infrastructure. By default, any data that is sent through the Azure OpenAI Service is not used to train AI models. You can read about the Azure OpenAI Service’s data, privacy, and security policies here.

  • No. For data to turn up in someone else’s ChatGPT conversation, it would need to have been ingested into ChatGPT’s training dataset. Any data that WhyHive sends to Microsoft via the Azure OpenAI Service is not available to OpenAI and cannot be used to train its AI models.

    In other words, we interface with a version of GPT that is entirely hosted and controlled by Microsoft. None of the data we send to this version of GPT is available to OpenAI, and none of it is used to train AI models.

    You can read about the Azure OpenAI Service’s data, privacy, and security policies here.
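
For the technically curious, the sketch below shows what interfacing with GPT via the Azure OpenAI Service typically looks like using the official openai Python SDK. The endpoint, deployment name and prompt are placeholders, not WhyHive’s configuration.

```python
# Illustrative sketch: calling a GPT model through Microsoft's Azure OpenAI
# Service (not OpenAI's own API) with the official "openai" Python SDK.
# The endpoint and deployment name are placeholders, not WhyHive's settings.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# The request is handled on Microsoft-hosted infrastructure. Under Azure
# OpenAI's default policy the prompt is not used to train models, and
# prompts/outputs may be retained for up to 30 days for abuse monitoring.
response = client.chat.completions.create(
    model="gpt-4o-deployment",  # placeholder Azure deployment name
    messages=[{
        "role": "user",
        "content": "Suggest three questions to ask about this survey data.",
    }],
)
print(response.choices[0].message.content)
```

The key point is that the request never reaches OpenAI’s own API; it is handled end to end on Microsoft’s infrastructure.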

Deleting data

  • Yes. You can delete a project from the app homepage. This permanently deletes the project, its data and all of its associated charts.

    You can delete your account and all associated project data by contacting our team.

  • Yes, you can edit the data you upload by removing columns or changing the values of individual cells.

Got more questions?
Email info@whyhive.com

Ready to try WhyHive?